2021 MIT Summer Research Program Research Forum

Office of Graduate Education, MIT


The 36th Annual MSRP (Virtual) Research Forum is an event celebrating the hard work and scholarship of the 2021 cohort while exploring cutting-edge research happening across the Institute.

 

This year’s two-part virtual forum will include a formal program featuring a keynote address by Dr. Carlos Ramos Rinaldi ('02 MIT PhD; '97, '98 MSRP), current Chair and Dean's Leadership Professor in the Department of Chemical Engineering at the University of Florida.


Immediately following the formal program (2:00 PM ET), join the virtual poster session where you will have an opportunity to meet this year's 78 MSRP Research Interns and learn about their work with MIT faculty this summer. 


TO ATTEND A VIRTUAL POSTER DISCUSSION, click the "chat" button during the poster session (2:00 PM - 3:00 PM ET, August 5, 2021), or click the video button for the few posters featuring a pre-recorded video presentation.


Rewatch the formal program and keynote speaker here: http://web.mit.edu/webcast/msrp/


If you would like to provide feedback for any poster presentations you attended, please visit https://bit.ly/21msrpfeedback


More info: http://web.mit.edu/webcast/msrp/


Brain Switch-Chatbot: Enhancing Communication for People with ALS

Camila Acevedo Carrillo, Guillaume Hessel, Nataliya Kosmyna, and Pattie Maes

Abstract
Brain Switch: Enhancing Communication for People with ALS

Camila Acevedo Carrillo¹, Guillaume Hessel², Nataliya Kosmyna³, and Pattie Maes³

¹Department of Computer Science, University of Central Florida; ²MIASHS Technology and Disability, University Paris 8; ³Department of Media Arts and Sciences, Fluid Interfaces, Massachusetts Institute of Technology

The Brain Switch system is an innovative brain-computer interaction (BCI) tool that helps patients with Amyotrophic Lateral Sclerosis (ALS) and other motor disabilities communicate more effectively with their caretakers. However, there is also a need for a channel through which caretakers and family members of the patient can communicate any problems or doubts pertaining to the tool and its accompanying mobile application. The goal of this project is to develop and test an easily scalable and deployable chatbot that improves communication between multiple caregivers, patients with ALS, and the Brain Switch research team, allowing users to share their experiences and difficulties with the Brain Switch system. The main priorities in selecting the stack for the chatbot implementation were keeping the patient's information private, making installation easy and hassle-free for users, supporting the sending and receiving of diverse media files, and communicating in multiple languages. The resulting chatbot system, developed using Landbot Chatbot Builder, meets all of these requirements. Moreover, the system is connected to a secure online database that stores the daily survey and customer-support participation data submitted by caretakers and family members. The chatbot is also very simple and accessible to deploy: users reach it via a URL link that can be opened and completed on multiple devices.
Presented by
Camila Acevedo Carrillo
Institution
University of Central Florida, Department of Computer Science

Creating a Spatial Sound Sculpture for More-Than-Human Exchanges

Jessica R. Mindel, Nicole L'Huillier, and Tod Machover

Abstract
In a volatile and intolerant cultural moment, the importance of connection has become increasingly evident. While the quantity of accessible communication technologies has increased, people may experience a poverty of meaningful connection, just as the Earth faces a poverty of resources during the Anthropocene. To resist this disconnect and emphasize the importance of all (non-)human bodies, we integrate (non-)Western perspectives to explore listening as a form of ritual practice, invisible architecture, and connectivity through collectivity. We present two sculptures that sonify the wind to understand how we might create conversation with non-human entities in auditory installations. We developed a novel membrane-based accelerometer microphone that interprets signals as collective vibrations, and a low-representational Markov model through which the sculpture autonomously co-designs rituals with the audience. We propose spatial, sound-based interactive systems that encourage wind-like audience behavior through call and response, promote alternative listening and grounding through transduction and unfamiliar filtering of real-time wind recordings, and challenge a prior focus in sonification literature on one-to-one data mappings, favoring ambiguity. Based on audience responses during the installation, we seek to develop generalizable protocols for the design of inclusive, more-than-human interaction that leads to a collective experience. We further hope to raise awareness of the importance of South American and non-human voices through our subversion of anthropocentrism and the Western ear.
Presented by
Jessie Mindel <jmindel@mit.edu>
Institution
University of California, Berkeley

Liquid Videos: An interactive educational experience for telecommunications

Gianna Williams, Aruna Sankaranarayanan, Andrew Lipman

Abstract
COVID-19 has devastated our health system, strained our political system, and highlighted disparities within our education system. The health crisis necessitated a shift from in-person to virtual learning. Students scrambled to adapt to this new learning format; however, this proved difficult, especially for students with disabilities. Virtual learning platforms such as Zoom were widely used, but were soon labeled boring, redundant, uninteractive, and excessively long. Liquid Videos answers these complaints by condensing videos into a more compact time frame, creating more interesting, and therefore more beneficial, educational videos for the user. This is made possible through Human-Computer Interaction and Social Computing, yielding an accessible video platform that learns to annotate videos according to the user's available time. The platform uses machine learning to segment the transcript of a video, applying techniques of sentence classification and topic-based segmentation. We split the educational videos into Intro, Review, Calculation, and Conclusion segments. Currently our data is sourced from Khan Academy math videos. Through this application we give students agency and autonomy, allowing them to watch what meets their needs. In this way, Liquid Videos will make the educational platform accessible to all students, tailoring their experience to their abilities.
Presented by
Gianna Williams
Institution
Department of Media Arts and Sciences, Massachusetts Institute of Technology

Parametrized Templating of The U.S. Building Stock For Embodied Carbon Estimation

Lauren Moore, Ramon Weber, Caitlin Mueller, Christoph Reinhart

Abstract
In the total life cycle of a building, carbon emissions are incurred at every stage and are generally categorized as either embodied or operational carbon. Much previous research on cutting building emissions has focused solely on operational carbon. However, this is not sufficient, as recent studies suggest that the embodied carbon of structural materials accounts for at least 50% of life cycle emissions. With the built environment accounting for 40% of carbon emissions in the United States alone, the urgency for better data and guidance on how to implement low-embodied-carbon strategies becomes apparent. Unlike operational energy, a proactive design approach is critical because embodied carbon is fixed once a building is constructed. This research extends existing methods for estimating the embodied carbon of the U.S. building stock through accurate material quantification. We focus on the creation of an embodied carbon template for standardized apartment buildings. Generative design methods and automated workflows are used to analyze prototypical large-scale timber framing systems and their structural material quantities. Through the study of commercial construction details and building floorplans, we create a template for a prototypical apartment building. With the use of parameterized geometric and material variables, the model can be applied to a variety of buildings without the need for manual input. The resulting embodied carbon templates are an essential tool for early-stage design decisions and have the potential to influence policy implementations and estimate the environmental impact of new developments.
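As an illustrative aside, the template idea described above (parameterized geometric and material variables mapped to an emissions total) can be sketched in a few lines. All names, material intensities, and emission factors below are hypothetical placeholders, not values from this research.

```python
# Illustrative sketch of a parameterized embodied-carbon template.
# The material intensity and emission factor are made-up example values.

def embodied_carbon_kgco2e(floor_area_m2: float,
                           n_floors: int,
                           timber_kg_per_m2: float = 55.0,    # assumed structural timber intensity
                           timber_ef: float = 0.44) -> float:  # assumed kgCO2e per kg of timber
    """Embodied carbon = sum over materials of (material quantity * emission factor)."""
    timber_mass = floor_area_m2 * n_floors * timber_kg_per_m2
    return timber_mass * timber_ef

# A hypothetical five-story apartment building with 800 m2 floor plates:
total = embodied_carbon_kgco2e(floor_area_m2=800, n_floors=5)
```

In a real template, additional materials (concrete, steel, envelope) would each contribute a quantity-times-factor term, with the quantities derived from the parameterized geometry rather than entered manually.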
Presented by
Lauren Moore <lcmoore@mit.edu>
Institution
Howard University, Department of Civil and Environmental Engineering

Structural upcycling of trees: Applying traditional joinery mechanics to digital design with tree forks

Aldrin James Gaffud and Caitlin Mueller

Abstract
Tree forks are an underutilized element in the timber industry due to their variability and complex milling needs. In recent work, tree forks have been studied for their low environmental and economic cost in conjunction with their natural ability to perform well structurally. However, the question of how to mechanically join them with each other or other timber elements to create large structures, especially in a scalable, standardized manner, remains unanswered. This research focuses on engineering the traditional splice joint, a type of scarf joint, as a parametric connection mechanism to be used for complex structures built with irregular tree forks. The scarf joint was common in historical boat design and used to lengthen wooden boards and other timber elements when they were cut too short. Due to the joint’s geometric symmetry as well as the congruency between the connecting elements facilitating its fabrication, it is a compelling connection mechanism for modern structures made with nonstandard elements. In this research, the splice joint is explored in three ways. First, this research mathematically models the joint to predict its loading capacities. Second, the joint is digitally modeled in a CAD environment and parameterized using four design variables of structural relevance. Third, reclaimed tree branches are used to fabricate these connections to be tested in three-point bending. In future work, these experimental results will be compared to the original predicted structural capacities of the mathematical models. By expanding knowledge about the structural potential of scarf joints applied to tree forks, this work contributes to the movement towards environmental responsibility in architecture and engineering.
Presented by
Aldrin James Gaffud
Institution
School of Architecture and Department of Civil and Coastal Engineering, University of Florida

Systems Architecture Analysis for Innovation Dynamics of Emerging Co-Creation Practices

Manuel E. Torregrosa Cueto, Katlyn Turner and Danielle Wood

Abstract
The process for creating technology directly shapes who benefits from a technology and how it impacts societal disparities along characteristics such as gender, race, class, national origin, and sexuality. The proposed project creates new knowledge and evaluation methods to understand specific emerging innovation trends, while considering geographic effects. Specifically, this project describes the emerging innovation activities known as Co-Creation practices. It evaluates how these practices may impact the extent to which historically marginalized groups participate in innovation and experience beneficial outcomes in the context of several urban centers. My contribution to this research is concentrated on assessing various case studies from sustainable energy and robotics organizations in the Boston area: MassRobotics, Greentown Labs, and the UMass Lowell NERVE Center. Systems Architecture analysis is used to understand the systems contexts, objectives, stakeholders, and relationships of these organizations. This analysis is qualitative, derived from public information (web-based and social media platforms). Conceptual models are produced from this data to illustrate these systems' connections. This inductive process will provide preliminary insight into each case study and guide the subsequent phase of approaching the organizations for future interviews.
Presented by
Manuel Torregrosa <mtorregr@mit.edu>
Institution
Massachusetts Institute of Technology, Department of Media Arts and Sciences

Testing the Creative Potential of Pixel Play Through Workshops

Christina Marquez, Jaleesa Trapp, Kreg Hanning, Mitchel Resnick

Abstract
Lifelong Kindergarten designs technologies to foster creativity in youth. According to Dr. Mitchel Resnick, a technology is considered successful if users can produce a wide variety of projects. To achieve this, a technology should have a low floor (easy to get started), high ceilings (projects can grow more complex over time), and wide walls (multiple paths can be taken). One major way we test these technologies is by designing and facilitating workshops where youth interact with and test these devices. Using this method, we tested the current features and creative potential of Pixel Play, a device that allows users to program LED lights using Scratch blocks. Workshops were designed in accordance with Mitchel Resnick's creative learning principles as outlined in his book, Lifelong Kindergarten. During play tests, participants' behaviors and comments were observed and recorded. We learned about the possible uses of Pixel Play, as well as current limitations of the device. Results revealed features that would improve Pixel Play, including the addition of an 'IF' statement block to the software and motion sensor capabilities to the hardware. With these results in mind, the design of Pixel Play needs to be iterated on and tested in larger workshop settings.
Presented by
Christina Marquez <marquezc@mit.edu>
Institution
University of California, San Diego; MIT Media Lab


A Data Driven Decision Support Tool for Treatment Selection in Multiple Myeloma

Fernando A. Acosta Perez, Zeshan Hussain, David Sontag

Abstract
Increasing the quality of life of cancer patients while making clinical practice more effective is a challenging problem in incurable progressive cancers like Multiple Myeloma (MM). One of the main concerns with respect to MM is the lack of data-driven treatment selection techniques to make targeted decisions for specific patients. As part of this study, a discussion group with a set of oncologists was conducted to gather data about current needs within the treatment selection process in MM. Employing expert recommendations, a web-based decision support tool (DST) was designed using data from CoMMpass. The tool consists of a set of visualizations that can be stratified by patient demographics and medical records. It contains information about progression-free survival, adverse events, and longitudinal data from lab tests. Survival analysis models were parameterized using a Kaplan-Meier estimator, and survival curves were plotted as a function of different treatments. To ensure statistical overlap when comparing different treatments, OverRule, an algorithm that finds overlapping groups by generating a set of boolean rules, was fitted. Results from the discussion group confirm statements from the literature claiming that treatment selection for MM relies mostly on the physician's clinical acumen. In addition, preliminary results indicate that a decision support tool like the one developed in this study may lead to better clinical practice, but experiments to test this hypothesis remain a subject of future research.
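As background, the Kaplan-Meier estimator mentioned above can be sketched in a few lines of plain Python/NumPy. The follow-up times below are toy numbers for illustration, not CoMMpass data.

```python
# Minimal Kaplan-Meier estimator sketch (illustrative only).
import numpy as np

def kaplan_meier(times, events):
    """Return (unique event times, survival probabilities).
    times: follow-up durations; events: 1 = event observed, 0 = censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_uniq = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in t_uniq:
        at_risk = np.sum(times >= t)                # patients still under observation at t
        d = np.sum((times == t) & (events == 1))    # events occurring at t
        s *= 1.0 - d / at_risk                      # product-limit update
        surv.append(s)
    return t_uniq, np.array(surv)

# Toy cohort: events at t=3, 5, 8; censored at t=5 and 10.
t, s = kaplan_meier([3, 5, 5, 8, 10], [1, 1, 0, 1, 0])
```

In practice a library such as lifelines would typically be used instead, since it also provides confidence intervals and log-rank comparisons between treatment arms.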
Presented by
Fernando Acosta Perez
Institution
Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science

A Point-of-Care platform for biomarker detection using inkjet printed microfluidic chips

Paola M. Morales Carvajal, Kruthika Kikkeri, Joel Voldman

Abstract
Point-of-Care (PoC) technologies are promising platforms for the detection and monitoring of a variety of diseases. Although PoCs have potential for several healthcare applications, there are still challenges in translating these platforms to clinical settings. In particular, up-scaling these devices for manufacturing is challenging because current methods can be very expensive and time-consuming and have low yield. Thus, an alternative method for manufacturing PoC devices in less time and with less equipment is needed to achieve a cost-effective platform. Here, we present a technique for the fabrication of low-cost microfluidic devices through the printing of electrodes. The electrodes are inkjet printed on polyethylene terephthalate (PET), and then bonded to a polydimethylsiloxane (PDMS) microfluidic chip. Carbon ink (50 mL Carbon Ink Metalon JR-700) was used for the electrode array, and different chemical techniques were implemented for bonding the PDMS structures to the printed electrodes to create a microfluidic chip. We characterized the printed electrodes and the PDMS-PET bonding by examining electrode thickness, resistivity, adhesion of carbon to PET, and delamination of the assembled microfluidic chip. These chips will be used in an electrochemical immunoassay to detect inflammatory biomarkers. However, this microfabrication technique could be utilized for other microfluidic applications.
Presented by
Paola M. Morales-Carvajal <paolamc@mit.edu>
Institution
Biomedical Engineering, Polytechnic University of Puerto Rico, 00918

Atmospheric Methane Abatement via an Earth-Abundant Catalyst

Audrey Parker, Rebecca Brenneis, Desiree Plata

Abstract
Atmospheric methane (CH4) is 120 times more potent than carbon dioxide (CO2), and its reduction is necessary to limit global temperature increase to less than 2℃, but there are currently no abatement technologies capable of low-level methane removal. Over the next 20 years the global warming potential of methane is 86-fold greater than that of carbon dioxide, and in the next 10 years methane will contribute roughly equal climate forcing as CO2, despite having less than 0.5% of its concentration (419 ppmv CO2 vs. 1.85 ppmv CH4). Current source control technologies only accommodate high-concentration (>5%) sources, yet the primary sources of global methane emissions are spatially dispersed and diffuse. This project examines the efficacy of a biomimetic catalyst capable of low-level methane conversion. Zeolite (e.g., Mordenite, ZSM5, Y) powders and transition metals (e.g., Cu, Ni) were used to generate a catalyst containing reactive oxygen species, which effectively activate the C-H bond in methane. The copper zeolites and nickel Y demonstrated the ability to remove methane at atmospheric (1.85 ppmv) concentrations and temperatures as low as 250 ℃, over one hundred degrees cooler than previously reported. These findings strongly suggest that this class of catalyst is an effective, sustainable, and low-cost form of low-level methane abatement.
Presented by
Audrey Parker
Institution
Department of Materials Science and Engineering, Boise State University

Characterization of RTK tyrosine phosphorylation in response to chemotherapy drugs in triple negative breast cancer

Andrea S. Flores Pérez, Jason E. Conage-Pough, Forest M. White

Abstract
Triple negative breast cancer (TNBC) is an aggressive subtype characterized by the absence of receptors commonly found in the disease. Due to the lack of critical biomarkers, there are no effective targeted therapies for TNBC, and as a result, cases tend to have poorer prognosis and shorter relapse-free survival than other types of breast cancer. Among the most well-established therapeutic targets in breast cancers are receptor tyrosine kinases (RTKs). In TNBC, about half of the cases overexpress a subtype of RTKs called the epidermal growth factor receptor (EGFR) family. However, EGFR-targeted therapies have demonstrated poor efficacy in TNBC patients. Thus, there exists a critical need to better understand dysregulated signaling in TNBC tumors to improve treatment options for patients. In this study, we use liquid chromatography tandem mass spectrometry (LC-MS/MS) to measure signaling changes in TNBC in response to the genotoxic chemotherapy treatment Taxol and the EGFR inhibitor Erlotinib. We also evaluated the viability of TNBC cells in vitro in response to both drugs. Interestingly, Erlotinib appeared to prevent the Taxol-induced phosphorylation of several proteins without inhibiting EGFR phosphorylation. These results could reveal novel insights into intrinsic and acquired resistance to therapy in TNBC.
Presented by
Andrea S. Flores Pérez <aflores2@mit.edu>
Institution
University of Puerto Rico - Mayagüez

Characterizing Mechanical and Electrical Properties of Agarose Phantom to Model Stomach Tissue

Brandon Rios, James McRae, Giovanni Traverso

Abstract
According to the CDC, one in every 17 adults in the US suffers from a digestive disorder. Electrical stimulation of the stomach through mechanical or hormonal modulation is a promising therapeutic intervention to reduce nausea in patients; however, current FDA-approved devices require invasive implant procedures. The Traverso Lab at MIT is developing an ingestible device for the non-invasive delivery of electrical stimulation to the stomach for hormone modulation. This device, currently being tested in vivo in large animal (porcine) models, can successfully deliver electrical stimulation to the stomach mucosa to induce hormone modulation. In order to optimize the mechanical design to ensure reliable and targeted electrical stimulation, rapid prototype iteration and evaluation is critical. This has been pursued via a benchtop agarose-PBS phantom that accurately models the mechanical and electrical properties of stomach tissue. By characterizing the mechanical properties of agarose through tension and compression testing, the Young's Modulus, a measure of stiffness, was acquired. Comparing this data to stomach tissue mechanical data from the literature, a representative model of the different layers of the stomach wall was fabricated, enabling rapid benchtop testing of these ingestible devices for the therapy of gastrointestinal disorders.
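For context, extracting Young's modulus from tension or compression data amounts to fitting the slope of the linear (elastic) region of the stress-strain curve. The data points below are hypothetical, not measurements from this work.

```python
# Sketch: Young's modulus as the slope of the elastic region of a
# stress-strain curve. Values are made up for illustration.
import numpy as np

strain = np.array([0.00, 0.01, 0.02, 0.03, 0.04])      # dimensionless
stress = np.array([0.0, 1.0, 2.0, 3.0, 4.0]) * 1e3     # Pa, hypothetical soft-gel range

# Linear fit over the elastic region; slope = Young's modulus (Pa).
E, intercept = np.polyfit(strain, stress, 1)
```

With real test data one would first identify the linear region (e.g., by inspecting the curve or thresholding strain) before fitting, since agarose and tissue both become nonlinear at larger deformations.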
Presented by
Brandon Rios <brios@wustl.edu>
Institution
Massachusetts Institute of Technology

Characterizing and Verifying Magnetorquer-based CubeSat Attitude Determination and Control System Hardware

Carlos A. Morell Rodríguez, Paula do Vale Pereira, Nick Belsten, Alex Choi, and Kerri Cahoy

Abstract
CubeSats are miniaturized satellites that provide a cost-effective way of obtaining on-orbit data, such as Earth observation. Embedded within CubeSats, specifically within the BeaverCube satellite (an MIT CubeSat that is set to launch in December 2021), is an Attitude Determination and Control System (ADCS), which senses and controls the position of the spacecraft. One common ADCS component is the magnetorquer. Magnetorquers control the angular momentum of a spacecraft by producing a magnetic field that interacts with the Earth's magnetic field, producing a torque that rotates the spacecraft to the desired direction. Magnetorquer measurements, however, present error and capture undesirable noise from the outside environment. The magnetorquer noise was captured and quantified by performing actuation tests inside a cleanroom environment where the error constraints (actuation time, sampling rate) within the system could be defined. Actuations from the magnetorquer were analyzed by observing magnetic field changes at nearby reference magnetometers. This method will increase the precision of the ADCS onboard BeaverCube. The actuation tests will also improve upon previous noise characterization performed for BeaverCube, which relied on the unrealistic assumption of mathematically perfect noise, e.g., Gaussian noise. With fully characterized noise, the flight computer will now recognize spurious frequencies, ensuring that BeaverCube points adequately at the Earth and successfully completes its Earth observation mission.
Presented by
Carlos Morell Rodriguez
Institution
University of Puerto Rico, Mayagüez

Chemical Characterization of Heavy Metals for Healthy Agriculture

Chantaly Villalona, Kristen Riedinger, Lai Wa Chu, Desiree Plata

Abstract
Local agriculture is critical to public health and food security, providing produce that supplies essential nutrients but may also contain harmful contaminants. Industrial pollutants are ubiquitous in the environment and can serve as a major source of pollution in the soil, water, and air. Currently, exposure to chemicals via food and farms remains poorly documented. To address this, with Daedalus Software Inc., we seek to increase consumer access to food history through technology. Our analysis will inform this technology, providing the necessary chemical characterization data. This study will collect soil, produce, water, and grass samples at 100 farms in Massachusetts (MA), varying in proximity to Environmental Protection Agency (EPA) Superfund sites. Using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) and Flame Atomic Absorption Spectrometry (FAAS), we identified and quantified a total of 22 metallic elements, including lead, copper, chromium, and calcium. We aimed to understand the relationship between proximity to EPA Superfund sites and concentrations of "Superfund" chemicals in agricultural samples. Having mapped all EPA Superfund sites in MA alongside the 21 previously sampled farm locations in ArcGIS, we found that only seven were within ten miles of an EPA Superfund site. Preliminary observations show little to no relationship between proximity to the nearest EPA Superfund site and concentration of elements of interest, ultimately highlighting no distinguishable systematic contamination of MA farms by EPA Superfund sites. Additionally, most pollutant concentrations were below EPA maximum contaminant levels and resident soil screening levels, with the exception of a few outliers. Future studies will continue sampling and characterizing harmful chemicals prior to distributing reports to farmers and providing access to food history at the customer level to boost confidence in the nation's food supply.
Presented by
Chantaly Villalona <cvillalo@mit.edu>
Institution
Wellesley College & Department of Environmental and Civil Engineering, MIT

Deep Artificial Neural Network Model For Improved Thermospheric Density Prediction

Guilherme M. Eymael, Peng Mun Siew, and Richard Linares

Abstract
Given the rising number of satellites and space debris, there is an increasing demand for more efficient space traffic management. For that, it is critical to better determine and predict satellites' orbits. Currently, the most significant source of error in satellite orbit prediction is the drag force calculation, due to inaccurate estimation of the thermospheric mass density. Therefore, we are working on a machine learning reduced-order model (ML-ROM) as a new approach for thermospheric density prediction to overcome the limitations imposed by other methods. Our model is trained via population-based training and uses dynamic mode decomposition with control for data propagation in time. The results are then compared to a proper orthogonal decomposition reduced-order model (POD-ROM) for validation. Our initial ML-ROM was found to perform similarly to the POD-ROM in accuracy. The ML-ROM's inaccuracy gradually increases with time due to error propagation. Moreover, the most significant absolute percentage errors in the ML-ROM predictions correspond to times of high solar and geomagnetic activity. In conclusion, with better training and incorporation of a recurrent neural network for data propagation in time, our model has the potential to allow for better thermospheric density prediction. This will decrease the probability of satellite/debris collisions, improving space traffic management.
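For readers unfamiliar with dynamic mode decomposition with control (DMDc), the core idea is to fit a linear model x_{k+1} ≈ A x_k + B u_k to snapshot data by least squares. The toy system below is purely illustrative; the actual ROM operates on reduced thermospheric density states driven by space weather inputs.

```python
# Toy DMDc sketch: recover (A, B) from snapshots of a known linear system.
import numpy as np

rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])   # hypothetical dynamics
B_true = np.array([[0.0], [1.0]])             # hypothetical control influence

# Generate snapshot data x_{k+1} = A x_k + B u_k with random inputs.
n_steps = 50
X = np.zeros((2, n_steps + 1))
U = rng.normal(size=(1, n_steps))
for k in range(n_steps):
    X[:, k + 1] = A_true @ X[:, k] + B_true[:, 0] * U[0, k]

# Stack states and inputs, then solve [A B] = X' * pinv([X; U]).
Omega = np.vstack([X[:, :-1], U])
AB = X[:, 1:] @ np.linalg.pinv(Omega)
A_est, B_est = AB[:, :2], AB[:, 2:]
```

In high-dimensional settings (like a gridded density field), the state is first projected onto a low-rank basis so the least-squares fit stays tractable, which is where the "reduced-order" part of the model comes in.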
Presented by
Guilherme Mainieri Eymael <eymael@mit.edu>
Institution
University of Nebraska-Lincoln, Department of Mechanical and Materials Engineering

Design of Novel RF Pulses for Fetal MRI using Rank Factorization (SLfRANK) to Reduce Heating and Improve Imaging Speed

Sebastian (Sebo) Diaz, Yamin Arefeen, Elfar Adalsteinsson

Abstract
Imaging in pregnancy proves to be a complex and challenging scenario in MR due to unpredictable fetal motion. Single-shot T2-weighted sequences, the gold standard for fetal imaging, use fast pulses with large flip angles to freeze fetal motion, maximize signal-to-noise, and provide adequate tissue contrast. However, the radiofrequency (RF) pulses deposit a significant amount of energy into the subject, quantified by the specific absorption rate (SAR), reducing patient comfort and acquisition efficiency. Currently, these fetal imaging pulse trains reach the FDA limits regarding patient safety. Embedded in the sequences are clinically designed excitation and refocusing envelopes intended to reduce SAR and limit dephasing. Nonetheless, these pulses still encounter SAR-related complications. Our group's novel excitation and refocusing pulses replicate slice profiles while simultaneously reducing energy output and producing a more linear phase. We leverage SLfRank, a recently proposed algorithm that combines convex optimization and the Shinnar-Le Roux algorithm to jointly solve for traditionally bi-coupled spin parameters. Our proposed excitation and refocusing pulse design reduces the energy output of the RF envelope by 9.75% and 22.55%, respectively, and maintains comparable excitation performance compared to clinically designed pulses.
Presented by
Sebastian (Sebo) Diaz <sdd@mit.edu>
Institution
University of Arizona, Department of Biomedical Engineering

Designing multifunctional fibers for testing in vivo molecular fMRI probes

Nadir Talha, Miriam Schwalm, Atharva Sahasrabudhe, Polina Anikeeva and Alan Jasanoff

Abstract
To establish effective therapeutic interventions for various neurophysiological diseases, it is imperative to first understand the whole-brain connectivity and function of the mammalian brain. Most neural probing devices record and act on only one of many different types of neural signals to develop a better understanding of the brain’s organizational levels and their functions. The goal of this project is to create multifunctional fibers that allow for electrical deep brain stimulation (DBS), photometry measurements and/or optogenetic control and contrast agent delivery, simultaneous to functional Magnetic Resonance Imaging (fMRI) scans. These multifunctional fibers are made from polymer and composite materials (e.g. carbon nanotubes) that are MR compatible. An angular implantation method is used to allow for a better fit of the animal in the MRI coil, compared to 90-degree implantation. Our devices were implanted either in the cortex of rats previously injected with viral vectors expressing GCaMP6f, or in the hippocampus of rats previously injected with the synthetic calcium indicator Oregon Green BAPTA-1AM, allowing for photometric data to be collected simultaneously to high field MRI. These multifunctional fiber devices are suitable for probing the brain in long-term chronic experiments to track brain connectivity and for testing novel imaging agents which can be infused directly at the DBS stimulation and/or photometry recording site.
Presented by
Nadir Talha
Institution
Case Western Reserve University, Department of Electrical Engineering

Determining Bitcoin's Load Flexibility

Max DiGiacomo, Aliza Khurram, Micah Ziegler, Jessika Trancik

Abstract
Intermittent sources of renewable energy like solar and wind suffer from periods of overproduction and underproduction. Flexible loads can respond to these situations by using energy that would otherwise be curtailed, or by shutting off in response to a shortage of energy. Loads that are sufficiently flexible can provide further revenue to energy suppliers or support grid reliability, and thereby help support the deployment of renewable energy technologies. Bitcoin mining is one potential source of flexible load. Organizations with heavy exposure to Bitcoin have modeled mining as a source of demand response and suggested that miners could allow larger amounts of solar capacity to be installed. Still, these organizations do not account for hardware or cooling limitations in their demand response models. In this project, we will explore the practicality of demand response given the operational flexibility and physical limitations of industrial-scale mining facilities (e.g., ramp-up times of mining hardware, thermal management requirements for facilities, etc.). To this end, we will first estimate the energy consumption of the Bitcoin network, then determine which system attributes are the most significant contributors to this energy consumption. Finally, we will look at how well suited each system attribute is to partake in demand response for supporting renewables adoption.
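As a simple illustration of the first step described above, a common bottom-up estimate of network energy consumption multiplies the network hashrate by an assumed average hardware efficiency. The figures below are hypothetical placeholders, not results from this project.

```python
# Back-of-the-envelope Bitcoin network power estimate (hypothetical numbers).

hashrate_th_s = 150e6        # assumed network hashrate, in TH/s
efficiency_j_per_th = 30.0   # assumed fleet-average miner efficiency, in J/TH

# Power draw: (TH/s) * (J/TH) = J/s = W.
power_w = hashrate_th_s * efficiency_j_per_th

# Annualized consumption, converted from watt-years to TWh.
annual_twh = power_w * 3600 * 24 * 365 / 3.6e15
```

A fuller model would weight the efficiency by the actual mix of deployed hardware generations and add overheads such as cooling, which is precisely the kind of system attribute the project proposes to examine.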
Presented by
Maximino DiGiacomo-Castillo
Institution
Stanford University

Determining Optimal Conditions for Intimin Protein Expression for the Detection of Pathogenic Escherichia coli in Food Products

Esteban G. Bermúdez-Berríos, Yining Hao, Hadley D. Sikes

Abstract
Food-borne pathogens are a significant problem in food safety worldwide, affecting about 600 million people and resulting in approximately 420,000 deaths. The development of tests for the detection of pathogens is one of the most important steps towards preventing diseases from getting out of control. By having tests available, affected populations can be monitored to plan the response to outbreaks of pathogenic bacteria or viruses. If testing devices that identify surface proteins of pathogenic bacteria are designed to require easy preparation and provide results quickly with high reliability, they can be used in regions with limited access to healthcare facilities and specialized equipment. In this study, we seek to determine the optimal conditions to overexpress intimin, a protein specific to pathogenic E. coli strain O157:H7, to be used as the analyte for detection. Our results suggest that intimin is better expressed at lower temperatures but needs a plasmid that induces the cells to overexpress the protein into inclusion bodies. Previous studies have only used intimin as a protein for display on the bacterial membrane surface. Therefore, although intimin is a potential candidate for E. coli detection, improved expression of intimin is necessary so that it can be purified and used to find binders with high affinity. Developing tests that can be used in settings where food products are processed will help in achieving improved food safety and nutrition everywhere in the world, especially in low- and middle-income countries.
Presented by
Esteban Bermúdez-Berríos
Institution
Department of Chemical Engineering, University of Puerto Rico - Mayagüez

Development of a Heat Flux Sensor for High Heat Flux Environments

Tobi Majekodunmi, Alina LaPotin, Dr. Asegun Henry

Abstract
One of the primary barriers to the widespread use of renewable energy is the lack of means to store excess energy for later use (e.g., solar panels do not generate energy at night). The Atomistic Simulation and Energy Research Group at MIT is developing an energy storage system that leverages the principles of heat transfer to store vast amounts of energy: Thermal Energy Grid Storage (TEGS). In TEGS, electrical energy is converted into thermal energy, which is conserved using insulation and subsequently converted back into electrical energy by thermophotovoltaic (TPV) cells. To measure the efficiency of the TPV cells, one must know the amount of heat absorbed by the cells. The heat absorbed can be measured using a heat flux sensor (HFS); however, commercially available HFSs possess a high thermal resistance, which raises the TPV cell's temperature and thereby lowers its efficiency. This project's goal was to develop an HFS with a low thermal resistance. Using the Seebeck effect (the voltage a material develops based on the temperature difference between two points), we explored the use of direct bond copper, silicon, and a thermopile configuration of nickel, copper, and aluminum oxide as potential HFSs.
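The sensing principle can be sketched as follows (all sensor parameters below are assumed, illustrative values, not those of the devices built here): a thermopile's output voltage gives the temperature difference via the Seebeck relation V = N·S·ΔT, and Fourier's law converts that ΔT across a known thermal resistance into heat flux:

```python
# Hedged sketch with assumed sensor parameters (not the project's hardware).
n_junctions = 50          # assumed number of thermocouple junction pairs
seebeck_v_per_k = 20e-6   # assumed net Seebeck coefficient, V/K per pair
r_thermal = 1e-4          # assumed areal thermal resistance, K*m^2/W

def heat_flux_from_voltage(v_out):
    """Heat flux (W/m^2) inferred from thermopile output voltage (V)."""
    delta_t = v_out / (n_junctions * seebeck_v_per_k)  # Seebeck: V = N*S*dT
    return delta_t / r_thermal                          # Fourier: q = dT/R''

print(heat_flux_from_voltage(1e-3))  # heat flux implied by a 1 mV reading
```

The relation also shows why low thermal resistance matters: for a given flux, a smaller R'' means a smaller parasitic ΔT across the sensor itself.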
Presented by
Tobi Majekodunmi <tmajek05@gmail.com>
Institution
Department of Mechanical Engineering

Diagnosing Magnetic Reconnection in Simulations of Tokamak Scrape-off Layer Turbulence

Reehan Siraj, Noah Mandell, Nuno Loureiro

Abstract
Improving our understanding of plasmas is important to help us develop widely available sustainable energy in the form of fusion energy. However, the many different fluid and electromagnetic interactions make the study of plasmas challenging. One region of laboratory plasmas that is particularly difficult to study is the tokamak scrape-off layer (SOL), or the edge region of the plasmas where they interact with the storage device, called a tokamak. The Gkeyll code has recently demonstrated the first capability to simulate gyrokinetic turbulence in the SOL where fluctuations in the magnetic field are accounted for. Including these fluctuations means that there may be magnetic field lines breaking and reconnecting in these simulations. This phenomenon, known as magnetic reconnection, can be difficult to diagnose in these three-dimensional regimes. In this work we develop a set of analysis tools that searches for signatures of reconnection and suggests locations where it may be occurring. This will enable further study of reconnection in the SOL and its impact on SOL dynamics and transport. Here, we apply these analysis tools to Gkeyll simulations, where we find evidence that magnetic reconnection is occurring in the simulations. These analysis tools can later be applied to other systems with three-dimensional magnetic reconnection, such as the solar corona or other astrophysical plasma systems.
Presented by
Reehan Siraj
Institution
William & Mary, Department of Physics

Effectively Reducing Salt Accumulation with Shape Memory Alloys

Aliyah Osman

Abstract
Unlike commercial desalinators, which are costly and energy-intensive, solar desalinators can be a sustainable and affordable way of acquiring drinkable water. The desalination process removes the salt in seawater, but at the cost of salt accumulation, which decreases the efficiency of the process. To counter this, shape memory alloy springs and hydrogel can be used to effectively reduce salt accumulation. The springs drive a passive day-night cycle that submerges the hydrogel in a container of water at night and raises it during the day. When the hydrogel is submerged, the accumulated salt detaches and dissipates away, renewing the desalination process. To simulate the day-night cycle, a solar simulator that mimics sunlight is turned on and off. As the solar simulator runs, the real-time mass change due to evaporation is recorded. This data will determine the evaporation rate and solar-to-vapor conversion efficiency. The device's salt rejection is evaluated by a salinity test. Additionally, a force balance model is developed to check the viability of the desalination device. Solar desalinators can provide a solution to a water crisis that worsens each year. This research will help advance the use of solar desalinators as a means of providing clean water at a low price.
Presented by
Aliyah Osman
Institution
Ohio State University, Department of Engineering

Efficient Ground Vehicle Navigation Using Aerial Images

Jehan Shalabi, Jacopo Banfi, and Nicholas Roy

Abstract
One of the most significant challenges in robotic navigation of previously unknown environments is the need to find efficient routes that avoid dead ends. Having robots work as teams to provide additional information about the terrain can help solve this problem. In this project, we consider a small team of two heterogeneous, autonomous robots composed of a ground vehicle and a quadrotor. As the quadrotor hovers at high altitude, it acts as an additional long-range sensor for the ground vehicle. Aerial images of the environment can be used by the ground vehicle to determine which regions currently beyond its sensing range are traversable. The ground vehicle then uses this new obstacle map to plan paths less likely to reach dead ends. To differentiate between objects in an environment and create an obstacle map, RGB thresholding is used. This obstacle map is then used to plan a path for the ground vehicle using the A* search algorithm. Using a high-fidelity simulator, we show that the proposed system can be used to plan efficient paths. The results of this research can help improve and expand the use of robots in exploring novel terrains for search and rescue, military surveillance, autonomous package delivery, agriculture, mapping, and inspection.
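The planning step can be sketched as standard A* over the binary obstacle map that thresholding produces. This is a minimal grid example under our own simplifying assumptions (4-connected grid, unit step cost), not the authors' implementation:

```python
import heapq

# A* on a binary obstacle grid: 0 = free, 1 = obstacle.
def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]  # (f, g, cell, path so far)
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(frontier, (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # unreachable: the kind of dead end the aerial map helps avoid

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (0, 2)))
```

Marking the wall ahead of time (from the aerial image) lets the planner route around it immediately instead of discovering the dead end with onboard sensors.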
Presented by
Jehan Shalabi <jshalabi@mit.edu>
Institution
New Jersey Institute of Technology, Department of Electrical and Computer Engineering

Engineering Binding Protein for Salmonella Detection in Food Production

Christian Lubin, Yining Hao, and Hadley D. Sikes, PhD

Abstract
Xibus Systems has previously sponsored research in the Hadley D. Sikes Laboratory to discover thermostable binding proteins for Listeria monocytogenes, an established food pathogen widely regarded as problematic. Xibus' current work is to discover additional thermostable binding proteins specific not just to Listeria, but also to Salmonella and pathogenic Escherichia coli (E. coli). By engineering a binding protein specific to Salmonella, we hope to better detect the pathogenic bacteria in food production. Detecting these bacteria during production would help prevent the contraction of related foodborne illnesses, which can result in severe sickness and even death. In this project, the Rapid Affinity Pair Identification via Directed Selection (RAPIDS) process will be utilized, which enables the identification of affinity reagents that function together as complementary pairs from in vitro libraries of about 10^9 variants. The process identifies complementary pairs that bind to separate epitopes without binding to one another or to non-targets, through the application of selective pressure to hundreds of thousands of potential affinity pairs. The intended application of the engineered protein is in food safety sensor devices, which can indicate the presence or absence of Salmonella within minutes. A diagnostic like this is valuable because it fosters safer food production and protects the health of countless consumers.
Presented by
Christian Lubin
Institution
Morehouse College, Department of Chemistry

Estimating Electron Transfer Kinetics for Flow Battery Electrodes using Dense Carbon Films

Akram B. Ismail, Charles Tai-Chieh Wan, Alexander H. Quinn, Fikile R. Brushett

Abstract
Redox flow batteries (RFBs) are promising technologies for the efficient and reliable delivery of electricity, offering opportunities to integrate intermittent renewable resources and to support unreliable and/or aging grid infrastructure. Within the RFB, the porous carbonaceous electrode provides surface area for redox reactions, distributes electrolyte, and conducts electrons. Understanding the reaction kinetics of the electrode is crucial to improving RFB output and lowering costs. However, reaction kinetics are driven by an interplay of factors, and the complex geometries of porous electrodes invalidate the assumptions in conventional voltammetric techniques used to assess electron transfer kinetics, thus frustrating our understanding of performance descriptors.

Here, we outline a strategy to estimate electron transfer kinetics on electrode materials reminiscent of those used in RFBs. First, we describe a bottom-up synthetic process to produce non-porous and planar carbon films to enable evaluation of electron transfer kinetics using traditional electrochemical techniques. Next, we characterize physicochemical properties of the films using a suite of spectroscopic methods. Last, we assess the performance of the films in a custom-designed cell architecture, extracting intrinsic heterogeneous kinetic rate constants in iron-based aqueous electrolytes using standard electrochemical methods (i.e., cyclic voltammetry, electrochemical impedance spectroscopy). We anticipate that the methods and protocols described in this work are broadly applicable for quantitatively assessing electrocatalysts.
Presented by
Akram Ismail <aismail4@mit.edu>
Institution
University of Rochester, Department of Chemical Engineering

Exploring Transition Metal Complex Space with Computation and Artificial Neural Networks

Adriana Ladera, Chenru Duan, Vyshnavi Vennelakanti, Heather J. Kulik

Abstract
Transition metal complexes (TMCs) are promising molecular systems of interest due to their broad applications in catalysis, sensing, and energy storage. However, due to their complex electronic structure, TMCs present unique challenges for existing computational electronic structure methods. Density functional theory (DFT), one of the most widely used electronic structure methods, is prone to be inaccurate in predicting properties of TMCs, and property predictions can disagree significantly depending on the choice of density functional approximation (DFA). Given the difficulties in their property evaluation, TMCs could be of interest for benchmarking electronic structure method development beyond DFT. We evaluated the total atomization energy (TAE) of selected TMCs using 23 DFAs that had varying levels of accuracy. We then trained artificial neural networks (ANNs) to learn the TAEs of these TMCs for each of the 23 DFAs separately, targeting TMCs in which relative DFA disagreement on TAE was large, indicating complex electronic structure. We can then build a workflow that would identify the TMCs with the most complex electronic structure via targeting TMCs with large disagreement across DFAs, and yield a benchmark set for the development of new electronic structure methods.
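The disagreement-targeting step can be illustrated with synthetic numbers (the TAE values below are invented for illustration, and the real workflow uses 23 DFAs, not 3): rank each complex by the spread of its predicted TAE across functionals, and treat the largest spreads as signatures of complex electronic structure.

```python
import numpy as np

# Synthetic example: rows are TMCs, columns are hypothetical DFA predictions
# of total atomization energy (kcal/mol). All numbers are made up.
tae = np.array([
    [310.2, 311.0, 309.8],
    [250.1, 268.4, 241.9],  # large spread -> likely complex electronic structure
    [410.5, 411.2, 410.9],
])

disagreement = tae.std(axis=1)            # spread across DFAs, per complex
ranked = np.argsort(disagreement)[::-1]   # most-disagreed-on complexes first
print(ranked[0])                          # index of the top benchmark candidate
```

The complexes at the top of this ranking would populate the benchmark set for methods beyond DFT.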
Presented by
Adriana Ladera
Institution
University of South Florida, Department of Computer Science and Engineering

Extending the performance of the STRUCT-e hybrid turbulence approach through consistent mesh refinement

Alice Ding, Emilio Baglietto

Abstract
Computational fluid dynamics (CFD) has been used widely to address safety-related design issues, such as identifying how unsteady turbulent flow in nuclear systems can limit the structural performance of critical components. Current methods such as direct numerical simulation (DNS), unsteady Reynolds-averaged Navier-Stokes (URANS), and large eddy simulation (LES) are limited by numerous factors, such as resolution, the range of Reynolds numbers they can resolve, or computational cost. As a result, hybrid methods have been proposed to leverage the specific strengths of the different methods. One such method is the STRUCT-e model, which uses URANS in quasi-equilibrium regions while adopting an LES-like method in regions where coherent turbulent structures appear. We propose to further this hybrid method by incorporating automatic mesh refinement (AMR) consistently with the hybrid activation, increasing the computational mesh resolution in regions of interest. Our method improves predictions of the fluid flow in regions of high turbulence while keeping the mesh resolution low in other areas to reduce the impact on computational cost.
Presented by
Alice Ding
Institution
Massachusetts Institute of Technology, Department of Nuclear Science and Engineering; Vanderbilt University, Department of Mechanical Engineering

Feature Selection for Image Classification

Mohini Anand, Kai Xiao, Shibani Santurkar, Aleksander Madry

Abstract
Image classification is a machine learning task that predicts the class of objects present in an image. Numerous models have been developed with strong performance and functionality; however, there is little understanding of how they make predictions. The aim of this work is to investigate whether models focus on the important features when making predictions. To this end, we leverage saliency maps, a standard primitive in the field of model interpretability, to gain insight into how important each pixel in the image is to the model's prediction. We use pre-annotated segmentation maps, in which objects have been manually labelled by humans, and compare them with the saliency maps of standard as well as robust models. Our analysis shows that standard ERM-trained classifiers do not always focus on the object of interest, but also on other features in the object's surroundings. While we find preliminary evidence that this problem is slightly alleviated by adversarially training classifiers, significant room for improvement remains.
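One simple way to quantify this comparison (a sketch with random data; the metric choice and threshold are our assumptions, not necessarily what the authors used) is to measure what fraction of the most-salient pixels fall inside the human-annotated object mask:

```python
import numpy as np

def saliency_inside_object(saliency, mask, quantile=0.9):
    """Fraction of the top-saliency pixels that lie on the annotated object."""
    thresh = np.quantile(saliency, quantile)   # keep only the top 10% of pixels
    salient = saliency >= thresh
    return (salient & mask).sum() / max(salient.sum(), 1)

# Toy inputs: a random "saliency map" and a hypothetical object region.
rng = np.random.default_rng(0)
saliency = rng.random((8, 8))
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True

print(saliency_inside_object(saliency, mask))
```

A score near 1 would indicate the classifier attends mostly to the object itself; low scores flag reliance on background features.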
Presented by
Mohini Anand
Institution
NYU Tandon School of Engineering

High Temperature Characterization of p-GaN Gate AlGaN/GaN HEMTs

Mohamed Fadil Isamotu, Mengyang Yuan, Tomás Palacios

Abstract
At high temperatures (over 300°C), conventional semiconductors such as gallium arsenide (GaAs) and silicon (Si) are fundamentally limited by their narrow bandgaps (1.44 eV and 1.12 eV, respectively) and high intrinsic carrier densities. For most conventional semiconductor-based electronics to operate at temperatures greater than 150°C, they must be coupled with external cooling systems, introducing more weight, complexity, cost, and often noise to the final device. Electronic devices that can reliably operate at such temperatures without a cooling mechanism would benefit a wide variety of fields, including the automotive, aerospace, petroleum, and geothermal industries. Gallium nitride (GaN), a wide bandgap semiconductor (WBG, 3.4 eV) known for its high thermal stability and inert nature, is a promising candidate for high temperature electronics. Moreover, the polarization nature of GaN enables the implementation of AlGaN/GaN HEMTs by forming a high-quality two-dimensional electron gas (2DEG) at the heterojunction. To prove the potential of GaN transistors for high temperature applications, it is critical to characterize the devices at the targeted operating temperatures for an extended period of time. In this study, we characterized self-aligned, normally-off Gate Injection Transistors (GITs, p-GaN gate AlGaN/GaN HEMTs) with an etch stop process and refractory metal gate, optimized for large scale integration and high temperature operation. 500°C survival tests were first carried out on devices with and without wire bonding pads for over 20 days, and ex-situ measurements were conducted during the test at room temperature to evaluate the devices' performance and the thermal stability of the wire bonding, and to analyze potential degradation mechanisms for further optimization. The unpadded devices showed improved performance after the survival tests, which can be attributed to improved ohmic contact and gate controllability. On the other hand, the devices with wire bonding pads showed high contact resistance and early velocity saturation due to the degraded connection between bonding pads and ohmic contacts. An automatic platform was then developed for in-situ measurements to monitor the devices' performance and potential degradation under DC stress at 500°C in real time. The devices showed stable operation at 500°C for over 24 h.
Presented by
Mohamed Fadil Isamotu
Institution
Morgan State University, Department of Electrical and Computer Engineering

How are Chromatin Loops Formed in Brain Cells? Investigating CTCF and Cohesin Loop Extrusion

Michele Gabriele, Hugo Brandão, Asmita Jha, Anders Hansen

Abstract
Understanding how gene expression is regulated is important to physiology and pathology. Gene regulation controls human development from the molecular level to the complex differentiation of tissues and organs in the human body. DNA can form long-range interactions that create chromatin loops and topologically associating domains (TADs) as a higher-order genome organization. TADs are structural domains that can also be involved in the interaction between enhancers and promoters, thus regulating gene expression. CTCF and cohesin proteins are involved in dynamically extruding loops from the chromatin. Mutations can disrupt the interaction between DNA, CTCF, and cohesin, which can cause developmental alterations and cancers. Little is known about the mechanism of gene regulation through CTCF and cohesin loop extrusion, especially since previous studies do not take into account how often they interact over time and space. We investigate how the CTCF protein and cohesin interact to create loops inside the nuclei of mouse embryonic stem cells differentiated into neurons, influencing gene regulation and creating the 3D structure of the genome, using super-resolution 3D cell imaging.
Presented by
Isadora Rocha De Abreu
Institution
Nova Southeastern University, Department of Psychology

InfraredTags: 3D Printed, Instant-Read Invisible Codes using Mobile Infrared Cameras

Akshat Kumar, Mustafa Doga Dogan, Stefanie Mueller

Abstract
We present InfraredTags, which are invisible codes embedded in 3D printed objects that can be scanned using infrared cameras, allowing computers and mobile devices to interact with the object. The key idea is to efficiently tackle the scanning and detection of codes which are invisible to the human eye, for real-time applications. Unlike previous research, InfraredTags are not only unobtrusive and cost-effective to fabricate, but also allow users to apply existing and widely used code types such as QR codes or ArUco markers, instead of a custom code.

We introduce the InfraredTags embedding interface, which allows users to implant codes anywhere in any 3D shape without sacrificing the look, feel, or functionality of the object, and then print the object so it can be detected using the InfraredTags imaging tool. This enables new ways to interact with 3D objects, such as using existing or arbitrary objects as pointers or game controllers, or remotely controlling appliances and devices in an augmented reality environment. Finally, we evaluate the accuracy of our method under different lighting conditions, with objects printed in different filaments, and with pictures taken from various positions and angles.
Presented by
Akshat Kumar <akshat1k@mit.edu>
Institution
University of Illinois at Chicago

Investigation of delivery methods for long-term GI wireless electroceuticals

Adrian Florea, Kewang Nan, Giovanni Traverso

Abstract
Obesity is a major affliction that leads to many chronic diseases and is a major world health concern; implantable technologies like electroceuticals show great promise for treatment. Electroceuticals use electrical stimulation to target hormonal and neurological pathways that control food consumption. These devices are implanted into the stomach surgically and have little capability for delivering patient-specific therapy. This project explores the feasibility of an ingestible device that can deliver wirelessly powered electroceutical therapy. Deployment and anchoring of a wirelessly charging antenna were identified as crucial tasks for this device, and I explored four groups of materials that could be suitable for this application. Through iterative optimization and benchtop experimentation, I developed a reliable shape memory polymer prototype. This prototype was evaluated qualitatively in a swine in-vivo model in which an external magnet controlled the location where the prototype was deployed.
Presented by
Adrian Florea
Institution
Vanderbilt University, Mechanical Engineering

Investigating the Effects of Numerical Parameters and Mesh Resolution within the Exasim Code

Jevon Ashman, Dominique Hoskin, Wesley Harris

Abstract
Modeling fluid-structure interactions in hypersonic airflows with high fidelity is of great interest in the aerodynamics field because of the need to predict skin friction and heat transfer on aircraft bodies. To accurately model these computational fluid dynamics (CFD) cases, the Exasim code is being used to numerically solve parametrized partial differential equations (PDEs). To quantify the model's accuracy, numerical parameters and the mesh resolution are varied between cases and compared to previous data. Understanding how these initial conditions affect the accuracy of the model will yield better simulations that can predict hypersonic turbulent flows with more precision and accuracy. To provide meaningful insight for improving the Exasim code, numerical parameters and mesh resolutions will first be given the same settings as previous numerical simulations and compared. Afterwards, these initial conditions will be varied to investigate how the simulation's stability, accuracy, and efficiency are affected.
Presented by
Jevon Ashman <ashmanj@mit.edu>
Institution
Morehouse College, Division of Mathematics and Computational Sciences

Language Based Image Editing with Neuron Captions

Teona Bagashvili, Evan Hernandez, Jacob Andreas

Abstract
How can we automatically edit an image given a language description of the edit? One way is to look at generated images and manipulate the models that generated them. We focus on Generative Adversarial Networks (GANs), networks that can generate photo-realistic images from scratch. We show that it is possible to perform fine-grained, localized edits of GAN outputs by selectively activating neurons based on their descriptions. Hernandez et al. have recently developed a method for generating descriptions of individual neurons in deep networks. It has been used for analyzing network behavior; here, we demonstrate that it can also be useful for changing network behavior. We select neurons based on the textual similarity between the language description of the edit and the neuron descriptions. To activate the selected neurons, we compare three ways of modifying the neuron activation values: (1) find the maximum activation value for each neuron across a large collection of generated images and edit the image by setting the relevant neurons to the respective activation values; (2) find optimal activation values using gradient descent; (3) maximize the activation values by multiplying them by a constant. Preliminary results show that the first and second methods are more effective at making photo-realistic edits to the image. However, compared to the second method, the first approach ensures that only the user-selected region of the image is affected.
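Strategies (1) and (3) can be sketched schematically. Here a small array stands in for one GAN layer's activations, and all names and numbers are our own toy assumptions, not code or values from the paper:

```python
import numpy as np

def edit_activations(acts, selected, max_acts=None, scale=None):
    """Modify the selected neurons' activations for one layer.

    max_acts -- strategy (1): clamp to maxima observed across many images.
    scale    -- strategy (3): multiply activations by a constant.
    """
    out = acts.copy()
    if max_acts is not None:
        out[selected] = max_acts[selected]
    if scale is not None:
        out[selected] *= scale
    return out

acts = np.array([0.1, 0.5, 0.2, 0.8])       # toy activations for 4 neurons
max_acts = np.array([1.0, 2.0, 1.5, 3.0])   # toy per-neuron observed maxima
print(edit_activations(acts, selected=[1, 3], max_acts=max_acts))
```

Strategy (2) would instead treat the selected activations as optimization variables and adjust them by gradient descent toward the desired output, which is why it is not a one-line edit like the other two.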
Presented by
Teona Bagashvili
Institution
Allegheny College, Department of Computer Science

Learning-Enabled Optimal Quantum Control

Maison Clouatre, Mohammad Javad Khojasteh, Moe Z. Win

Abstract
Optimal quantum control (OQC) has grown in importance as the fields of quantum information science and quantum computation grow. For instance, in order to perform logical operations on quantum bits (qubits), an external control field is required to manipulate the state of the qubit. However, OQC requires a mathematical model of the underlying quantum system, and such a model may be difficult to obtain a priori. In this work, we propose a novel quantum tomography based Hamiltonian learning (HML) algorithm which uses physical experiments to identify the internal and control Hamiltonians which govern the dynamics of the quantum system. Our approach involves an original optimization-based quantum process tomography algorithm defined over the complex Stiefel manifold, i.e., the set of unitary operators, to ensure physically meaningful predictions. This approach requires less memory and is more computationally efficient than state-of-the-art quantum tomography based HML algorithms. Once the dynamics of the system have been identified, OQC is used to generate optimal control sequences in a computer-based simulation of the quantum system using the learned model. Once control sequences are generated, they are given to a physical quantum system in an open-loop fashion in order to preserve coherence of its state. Both theoretical error bounds and numerical simulation support the efficacy of the proposed approach.
Presented by
Maison Clouatre
Institution
Department of Electrical & Computer Engineering, Mercer University; Laboratory for Information & Decision Systems, Massachusetts Institute of Technology

Lithium Dendrite Behavior in Ceramic Solid-State Electrolyte

Micah Thorpe, Cole Fincher, Yet-Ming Chiang

Abstract
Solid-state electrolyte (SSE) lithium-ion batteries are a safer alternative to liquid electrolyte lithium-ion batteries. Lithium dendrite formation is a phenomenon observed in both liquid and solid electrolyte systems, in which lithium ions travelling across the cell deposit in branches. Once lithium dendrites grow to the opposite electrode of a battery, the battery is shorted, causing battery failure. Understanding lithium dendrite formation in battery cells is difficult, since the internal behavior of a cell during cycling is mostly observed by tracking current and voltage during cycling and through ex-situ analysis. To further understand the behavior of lithium filaments in SSEs, a microscopically recorded probe system is used to propagate and visualize dendrite formation and SSE crack growth. Various intervals of constant current are applied to the system. It is expected that as the current increases, the dendrite will grow at a faster rate. Sudden increases in current tended to result in cracks splitting from one to multiple crack tips. The ability to instigate crack branching can delay the failure of a battery by diverting dendrite growth away from the opposite electrode.
Presented by
Micah Thorpe
Institution
Massachusetts Institute of Technology

Low Temperature Pressureless Sintered Molybdenum-Chromium Alloys via Nanophase Separation Sintering

Samuel Figueroa, Christian Edward Oliver, Yannick Naunheim, Christopher Schuh

Abstract
Nanocrystalline alloys have gained a great deal of interest among both the scientific community and the metallurgy industry due to their exceptional physical properties, owing to their very fine microstructures. However, sintering metallic powders into fully dense nanocrystalline parts is a challenge due to the tendency of microstructural features to coarsen under rigorous thermal processing cycles. Accelerated, low-temperature sintering techniques are thus necessary for the realization of these materials, as conventional low-temperature processing techniques such as liquid-phase sintering are incapable of achieving fine microstructures. Herein, we propose a novel sintering technique using micron-sized molybdenum (Mo) solvent powder particles with nanocrystalline grains supersaturated with a chromium (Cr) solute to move towards the development of a bulk nanocrystalline material. This is based on previous results with tungsten (W)-Cr and nickel (Ni)-Cr nanocrystalline materials using the same technique. Upon heating, these powder alloy systems begin to phase separate, forming a solute-rich phase with nanostructured necking between solvent powder particles that supports rapid diffusion of atoms, resulting in sintering behavior at a low temperature of ~950°C. Without applied pressure, binary alloy powders sintered to 1500°C already reach full density. Sample microstructures are imaged using optical and scanning electron microscopy, with results strongly implying very fine microstructural features. Further work will focus on computational studies of energy stability within the Mo-Cr system to glean further insight into creating a true bulk nanocrystalline alloy, as well as on expanding this sintering technique to other binary alloy systems.
Presented by
Samuel Figueroa
Institution
Massachusetts Institute of Technology

Manipulating PEGylated PAMAM Dendrimers and Comparing Cartilage and Meniscus Partitioning to Increase the Efficacy of Post-Traumatic Osteoarthritis Treatment

Laila N.J. Hayes, Brandon M. Johnston, Dr. Paula T. Hammond

Abstract
Osteoarthritis (OA) is a painful disease that affects up to 30 million people each year. The disease manifests as the degradation of articular cartilage, ultimately leading to the exposure of underlying bone. Potential disease-modifying OA treatments have been studied; however, these locally injected biologics require a drug delivery system to increase joint residence time and therapeutic efficacy. Previous studies have shown that cationic polyamidoamine (PAMAM) dendrimers covalently modified with polyethylene glycol (PEG) can adhere to the densely anionic aggrecan chains of articular cartilage through electrostatic interactions and improve the efficacy of covalently bound therapies. Meniscus tissue is capable of electrostatic interactions with the cationic dendrimer as well, which reduces the amount of dendrimer-bound therapy taken up by the target tissue. This research evaluates the partitioning of PEG-PAMAM conjugates into cartilage and meniscus using an ex vivo bovine model. By testing a small library of conjugates, we can identify the most effective PEG chain length and chain density for achieving effective accumulation in cartilage. This research used proton nuclear magnetic resonance spectroscopy (1H-NMR) and ex vivo tissue in a salt-based assay to characterize the PEG-PAMAM conjugates, and an ex vivo co-uptake experiment examining dendrimer uptake into cartilage and meniscus simultaneously to better simulate partitioning in the joint. For all the conjugates tested, articular cartilage had a higher uptake percentage than meniscus. This difference in uptake, or partitioning, was more significant for PEG with 4 repeat units (PEG 4) than for PEG 40 at all grafting densities tested. For both chain lengths, however, higher chain densities resulted in greater partitioning than lower chain densities. These findings suggest that a shorter PEG chain and greater PEG density result in more significant partitioning into cartilage.
Based on this information, we can improve the efficacy of dendrimer-bound therapies by controlling conjugate trafficking in the joint.
Presented by
Laila Hayes <hayeslnj@gmail.com>
Institution
Spelman College Department of Chemistry and Dual Degree Engineering; Massachusetts Institute of Technology Department of Chemical Engineering

Neutralization of Monopolar Electrospray Propulsion System via Low-Earth Orbit Plasma Environment

Ymbar Polanco Pino, Oliver Jia-Richards, and Paulo Lozano

Abstract
Typical electric propulsion systems emit positively-charged ions and require the use of a neutralizer that emits electrons in order to maintain spacecraft charge neutrality. For small spacecraft on the CubeSat scale and smaller, the operation of both a propulsion system and its neutralizer can impose significant requirements on the power system that limit the overall capability of the spacecraft. However, in low-Earth orbit the surrounding atmospheric plasma environment could be used to neutralize the spacecraft, thereby eliminating the need for a neutralizer. The neutralizing species from the surrounding plasma environment will depend on the polarity of ions emitted by the propulsion system: a system that emits positive ions will require a flux of positive ions from the plasma environment, while a system that emits negative ions will require a flux of electrons. Due to the greater mobility of electrons relative to ions, a propulsion system that emits negative ions will be able to emit far greater current without charging the parent spacecraft. This work analyzes the feasibility of using an ionic-liquid electrospray thruster emitting negative ions for propulsion of a small spacecraft in low-Earth orbit. A theoretical assessment of the maximum expected electron current at different solar activities is provided in order to determine the maximum thrust output of the propulsion system while avoiding spacecraft charging. An experimental validation of the concept is conducted in a vacuum environment in order to demonstrate that a monopolar propulsion system can produce thrust without the need for a neutralizer. The use of such a system could reduce the overall power requirements of propulsion systems for small spacecraft and improve their capabilities in order to enable more-affordable access to space.
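The abstract does not state the charging model used; as a rough illustration of how the maximum neutralizing electron current could be estimated, the random thermal flux of a Maxwellian electron population to a collecting surface (neglecting sheath effects and spacecraft potential, with all symbols introduced here as assumptions) gives

```latex
% Kinetic-theory sketch of the maximum electron collection current:
% n_e = ambient electron density, T_e = electron temperature,
% A = collecting area, m_e = electron mass, e = elementary charge.
I_{\max} \approx e\, n_e A \sqrt{\frac{k_B T_e}{2\pi m_e}}
```

The maximum thrust would then be limited to whatever emitted negative-ion current this collected electron flux can balance, which is why the estimate varies with solar activity (through n_e and T_e).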
Presented by
Ymbar Polanco Pino <ymbarp@mit.edu>
Institution
Massachusetts Institute of Technology, University of Missouri - Columbia

One-Shot Lexicon Learning for Low-Resource Machine Translation

Anjali Kantharuban, Jacob Andreas

Abstract
Machine translation is the task of translating natural language text with no human involvement. Advanced translation models allow people to communicate across language barriers. Current methods struggle to translate phrases containing words that appear infrequently in the training data. Low-resource languages in particular are harmed by this because they have so little data that rare words may be seen only once. This has been partially addressed using the lexical translation mechanism, which runs input tokens through a lexicon, or word-level translation table, and outputs them based on contextual information. These lexicons are built either manually, requiring human intervention, or using statistical methods, requiring many training samples. We propose a model which acts as an additional layer in a translation network and generates word alignments using the syntactic structure of training pairs. This model can both generate more accurate lexicons and better adapt to out-of-vocabulary tokens in new data. Most importantly, this model can allow translation networks to generate lexicon entries for words seen only once during training, improving one-shot translation. This can allow translation systems to better serve low-resource language speakers and offer insight into how other tasks can be adapted to function with less data.
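As a minimal sketch of the word-level lexicon idea described above (this is not the authors' model; the words and entries are invented for illustration):

```python
# Toy word-level lexicon ("lexical translation mechanism"): each input token
# is looked up in a translation table; unseen tokens fall back to a placeholder.

def translate(tokens, lexicon, fallback="<unk>"):
    """Translate token by token via the lexicon."""
    return [lexicon.get(t, fallback) for t in tokens]

lexicon = {"perro": "dog", "grande": "big"}
print(translate(["perro", "grande"], lexicon))  # ['dog', 'big']

# One-shot learning amounts to adding an entry after a single aligned example.
lexicon["gato"] = "cat"
print(translate(["gato", "grande"], lexicon))   # ['cat', 'big']
```

A real system would induce these entries from word alignments and condition its output on context, but the table-lookup core is the same.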
Presented by
Anjali Kantharuban
Institution
University of California, Berkeley

Optimizing Vaccine Storage

Manuel Cortés, Byungji Kim, Ryan Hosn, Darrell Irvine

Abstract
The ability to maintain vaccine efficacy from synthesis through storage to administration is among the most important considerations in clinical settings. Due to the enzymatic susceptibility of RNA vaccine technology, optimization of the cryoprotectant and storage conditions is an essential step. Here, we explore sucrose as a cryoprotectant with varying concentrations (0%, 5%, 10%, 30% in saline), solvents (phosphate-buffered saline and tris-buffered saline), and storage temperatures (4°C, -20°C, -80°C, and liquid nitrogen) for an RNA-loaded lipid nanoparticle (LNP) formulation for HIV. In vitro data from dynamic light scattering (DLS) and cryogenic electron microscopy (cryo-EM) narrowed the leading storage temperatures down to 4°C and -20°C, as they maintained a polydispersity and hydrodynamic diameter comparable to the freshly synthesized batch. For the in vivo experiments, IVIS bioluminescent imaging showed PBS to provide the strongest signal, while LNPs with a concentration of 10% sucrose exhibited the strongest immunological activity in an enzyme-linked immunosorbent assay (ELISA). Consolidating these results, we found that storage at -20°C in PBS at a 10% sucrose concentration optimizes RNA-LNP stability and immunological activity. Overall, this study has important implications for achieving sustainable production and distribution goals for vaccines in a global setting.
Presented by
Manuel Cortes <mecortes@mit.edu>
Institution
Massachusetts Institute of Technology, Department of Biological Engineering

Parameter Sensitivity Study on Non-Hydrostatic 2-D Subduction Dynamics

Sophia Keniston, Chris Mirabito, Patrick Haley Jr., Nevan Lim, Pierre Lermusiaux

Abstract
Ocean mixing occurs when bodies of water with different properties interact, advected by currents, waves, or wind. It is one example of a sub-mesoscale (< 10 km) 2-D flow that would ultimately affect the output of a larger mesoscale model when resolved numerically, but the impact is generally unknown. The 2.29 Non-Hydrostatic Finite Volume MATLAB Framework (MSEAS-FV) was developed by the MSEAS-MIT lab to isolate and analyze sub-mesoscale 2-D flows and ocean properties. The code solves the 2-D Navier-Stokes equations with Boussinesq approximations and was configured with constant wind and current forcing. This research involved varying the value of the horizontal density diffusion coefficient (K1) to analyze its effect on the oscillations produced by the subduction of heavy water introduced into an area of constant density. A concurrent phase of the research involved implementing non-constant time dependency, specifically for the wind stress applied along the top boundary condition. Results from simulations were compared to collected and constructed data from the 2019 CALYPSO Sea Experiment and the MSEAS-MIT Primitive Equation Model. Experiments showed that oscillations and velocity were noticeably altered when K1 and the wind stress were changed.
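The governing equations are not written out in the abstract; under the standard Boussinesq approximation they take roughly the following 2-D form, with the horizontal density diffusion coefficient written as K1 (the remaining symbols, including the vertical diffusivity K_v, are assumptions for illustration):

```latex
% 2-D incompressible Navier-Stokes with Boussinesq buoyancy,
% plus a density-anomaly equation with anisotropic diffusion.
\nabla \cdot \mathbf{u} = 0, \qquad
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho_0}\nabla p + \nu \nabla^2 \mathbf{u}
    - \frac{\rho'}{\rho_0}\, g\, \hat{\mathbf{z}},
\qquad
\frac{\partial \rho'}{\partial t} + \mathbf{u}\cdot\nabla\rho'
  = K_1 \frac{\partial^2 \rho'}{\partial x^2}
    + K_v \frac{\partial^2 \rho'}{\partial z^2}.
```

Varying K_1 in the density equation changes how quickly horizontal density gradients smooth out, which is the sensitivity probed in this study.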
Presented by
Sophia Keniston
Institution
Sweet Briar College

Predicting Small-Molecule Substrate Reactivity via Data Mining and Machine Learning

Jackson Burns, Priyanka Raghavan, Connor Coley

Abstract
Rapid development of pharmaceutical reagents is of the utmost interest for our collective health. Small-molecule pharmaceuticals in particular are highly sought-after chemicals. Unfortunately, the synthesis of these species can take months to discover and optimize because of a reliance on human-driven experimentation. Machine learning (ML) can be used to accelerate this process by building models that predict reaction conditions and yield and guide experimentation. Existing literature contains massive volumes of data suitable for ML, but it is not being used effectively at scale. Using a database of published chemical reactions, over 300 unique transformations involving samarium iodide from more than 200 separate publications are identified and used to build an accurate ML model. This family of reactions is of particular interest due to its complex relationship with solvent, which is not well modeled by current approaches. Each reaction is parameterized into machine-interpretable representations using molecular fingerprints and common DFT parameters, such as Fukui values and partial charges, generated using a pre-trained neural network. The resulting model will enable chemists to optimize their reactions using literature precedent more rapidly. This approach could also be used to automate chemical-space exploration by allowing a computer to predict conditions for, and then set up, its own experiments. The ML model created is pre-trained and can be distributed as-is for researchers interested in samarium iodide chemistry; it will also be extended to more diverse chemistries in the near future.
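Real fingerprint featurization uses cheminformatics toolkits (e.g. Morgan/ECFP fingerprints, which hash atom environments); as a purely illustrative, standard-library-only sketch of the general idea — hashing substructures into a fixed-length bit vector — one can hash character n-grams of a SMILES string:

```python
import hashlib

def hashed_fingerprint(smiles, n_bits=64, ngram=3):
    """Hash character n-grams of a SMILES string into a fixed-length bit vector.

    Illustrative stand-in for a real molecular fingerprint: each n-gram sets
    one bit, so similar strings share many set bits.
    """
    bits = [0] * n_bits
    for i in range(max(len(smiles) - ngram + 1, 1)):
        fragment = smiles[i:i + ngram]
        digest = int(hashlib.md5(fragment.encode()).hexdigest(), 16)
        bits[digest % n_bits] = 1  # set the bit this fragment hashes to
    return bits

fp = hashed_fingerprint("CCO")  # ethanol SMILES, for illustration
print(len(fp), sum(fp))         # 64 bits, one fragment -> one bit set
```

A supervised model would then be trained on pairs of such vectors (plus DFT-derived features) and reaction outcomes.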
Presented by
Jackson Burns <jwburns@mit.edu>
Institution
Department of Chemical Engineering, University of Delaware; Department of Chemical Engineering, MIT

Profiling of tRNA modification patterns in human embryonic kidney cells under oxidative stress

Fatima Madondo, Jingjing Sun and Junzhou Wu

Abstract
Recent research has shown that cells dynamically regulate modifications on transfer RNA (tRNA) to manage expression of survival genes in response to stress, based on biased use of synonymous codons that match the modification-altered tRNAs. tRNA modification patterns have been observed to be highly variable in bacteria, yeast, rodent, and human cell lines. This research studies tRNA modification changes in human embryonic kidney cells (HEK293) under oxidative stress, which have not been reported before. The hypothesis is that HEK293 cells change a specific group of tRNA modifications in response to oxidative stress. To measure this, we exposed HEK293 cells to hydrogen peroxide, isolated tRNA after treatment, hydrolyzed the tRNA into mono-nucleosides, and quantified relative modification changes by liquid chromatography-mass spectrometry (LC-MS). The results indicated that many wobble modifications significantly decreased after hydrogen peroxide treatment, supporting the idea that oxidative stress-induced tRNA modification reprogramming may take place in HEK293 cells. Further studies are needed to clarify its linkage with regulation of stress-essential genes.
Presented by
Fatima Madondo <fmadondo@mit.edu>
Institution
Lebanon Valley College

Quality Analysis of Slope-Assembled Opals

Cameron Kilpatrick, Carlos Diaz, Evelyn Wang

Abstract
Self-assembly of micro-to-nanoscale particles has recently gained traction as a nanofabrication technique due to the process’s ability to create highly ordered microstructures without manual control of the individual components. Slope assembly in particular merits further exploration as it is a particularly fast method, allowing for rapid fabrication with high throughput and order. The orderliness of a self-assembled opaline array can be quantified in terms of its quality, which in turn can have a strong influence on the structure’s performance. Quality values range from 0 to 1, with numbers closer to 1 being more orderly and thus desired. While slope assembly can reliably create good-quality samples on glass, it is not successful on metals, which are substrates critical to many potential applications. To remedy this, in this work a transfer method is used to first form an opal layer on glass, remove the layer by submersion in water, and pick it up with the desired material. Using an image processing algorithm, we found that while glass samples consistently produce quality values above 0.7, transferred samples featured large cracks and vacancies, causing some samples to have quality ratings as low as 0.4. However, in regions not affected by these vacancies, copper deposited on glass and mirror-polished copper produced almost identical structures and quality values. Our results indicate that substrate transfer is a valid approach with the potential to allow fabrication of slope self-assembled structures on arbitrary substrates. Such structures can have applications in thermo-fluidic devices and electrochemical systems.
Presented by
Cameron Kilpatrick
Institution
MIT Department of Mechanical Engineering

Refocusing Analog Electronic Pedagogy to Emphasize Practical Skill Development within a PSoC Environment

Xavier Smith, Eric Ponce, Dan Monagle, Nicolas Hougardy, Steven Leeb

Abstract
Electrical circuits stand at the center of design for providing functionality to many modern products. For electrical engineers in training to be as innovative and creative as possible when designing an electronic solution to a problem, hands-on experience is a critical part of undergraduate and graduate education. The complexity of modern products and components makes it harder to keep practical hardware examples in front of students. We seek to address these challenges by tailoring an analog electronics curriculum solely around efficient, practical implementations of electrical signal theory within the voltage, current, and other signal-limiting constraints of a modern embedded system called the PSoC, or Programmable System-on-Chip. The curriculum is a handbook of concise, optimized hands-on labs with the goal of familiarizing students with a real-world microcontroller environment and helping them internalize effective electronics techniques. We found that isolating the overarching characteristics of specific signal-processing and analog-circuitry concepts, and interfacing them with peripherals on board a PCB, was a pedagogically valuable delivery mechanism for helping students understand both the theoretical and the practical. Our plan is to launch the finished product for students to use during their undergraduate careers, not only invoking a problem-solving thought process within them, but giving them the tools they need to become mature, seasoned electrical engineers who can confidently approach electronic projects.
Presented by
Xavier Smith <xsmith1@umbc.edu>
Institution
University of Maryland, Baltimore County; Massachusetts Institute of Technology

Selective Kinase Inhibitors of the TYK2 Protein

Tyra Jones, Keir Adams, David Graff, Connor Coley

Abstract
Small-molecule pre-clinical drug discovery takes 31 months to complete and costs $474 million; more targeted discovery can reduce these costs. The current workflow involves target identification, target validation, lead identification and optimization, and preclinical development. This study seeks to aid the process of finding more selective drug candidates and expands on current literature that trains surrogate models on docking scores to predict kinase selectivity. The goal of this research was to create a screening methodology for new drug candidates that will favorably bind to a target protein and reduce the downstream effects of off-target binding, using the TYK2 kinase, related to diabetes, as a case study. Computational methods were used to dock 10,000 ligands against the TYK2 kinase and 4 off-target kinases. The project goal will be realized by collecting and analyzing the data, training a surrogate model on it, applying the model to a virtual library of 2M molecules, and comparing the predictions to the docking results. This new methodology can reduce the cost of the preliminary search for small-molecule therapies and reduce off-target effects. Diminished off-target effects can minimize unwanted side effects of therapies.
Presented by
Tyra Jones
Institution
Spelman College and Georgia Institute of Technology, Chemistry and Chemical and Biomolecular Engineering

The Effect Of Data Augmentation on Deep Representations

Phuc Ngo, Dimitris Tsipras, Saachi Jain and Aleksander Mądry

Abstract
Data augmentation is a simple and common technique that increases a model’s robustness to class-preserving transformations. However, our understanding of how data augmentation affects deep representations is limited. In this work, we study this question and hypothesize two possible mechanisms. The earlier layers of the model could map augmented inputs to representations similar to those of their standard counterparts. Or, the model could use an entirely different set of prediction rules to classify augmented samples. To test these hypotheses, we trained standard and augmented models and analyzed the similarity between their predictions and representations. Our results suggest that data augmentation has a range of effects on deep representations. Depending on the severity of the augmentation, models can vary between learning invariance and learning entirely separate augmented subpopulations.
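One common way to compare the representation of a standard input with that of its augmented counterpart, in the spirit of the analysis described above, is cosine similarity between activation vectors; the vectors below are invented for illustration:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two activation vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

rep_standard = [0.9, 0.1, 0.3]   # hypothetical layer activations, standard input
rep_augmented = [0.8, 0.2, 0.3]  # same layer, augmented input
print(cosine_similarity(rep_standard, rep_augmented))  # close to 1: similar representations
```

A model that has learned invariance would show similarities near 1 even for severe augmentations, while separate "augmented subpopulations" would show up as low similarity.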
Presented by
Phuc (Jerry) Ngo <ngoph@beloit.edu>
Institution
Beloit College, Department of Computer Science and Math

Towards Closed Loop Modulation of Gut Neural Circuits using Fiber Integrated Flow Sensors

Jesse de Alva, Atharva Sahasrabudhe, and Polina Anikeeva

Abstract
Multifunctional fibers have been used to bring a variety of stimulation and recording methods to diverse organ systems in freely behaving animals. However, use of these fibers for stimulation of gut neural circuits has been limited by the lack of feedback within the gut. To achieve a closed-loop system, we propose integrating a miniaturized thermal flow sensor within the fiber to allow operation in a non-contact mode. The focus of this work was to design, fabricate, and characterize such a flow sensor and to optimize its role in a closed-loop system. Realizing the project’s goals involved both confirming that the microscopic flow sensor can reliably track a food/fluid bolus and validating a full electronic system, integrated within the fiber, for implantation in the body. The miniaturized flow sensor consists of a thermal actuator and multiple thermal sensors located upstream and downstream of the actuator, parallel to the flow direction, to reliably detect a difference in temperature during the passage of the bolus. The resulting fiber prototype showed the ability to accurately detect the movement of material and appropriately trigger a stimulation event, demonstrating a closed-loop electronic system.
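The control logic of such a closed loop can be sketched in a few lines: a passing bolus changes the temperature difference between the sensors around the thermal actuator, and crossing a threshold triggers a stimulation event. This is a hedged sketch only; the threshold, readings, and function names are invented, not the authors' implementation:

```python
# Minimal closed-loop sketch: compare upstream/downstream thermal-sensor
# readings; a sufficient difference indicates a bolus passing the fiber.

def detect_bolus(t_upstream, t_downstream, threshold=0.5):
    """Return True when the upstream-downstream temperature gap indicates flow."""
    return (t_upstream - t_downstream) >= threshold

def closed_loop(samples, threshold=0.5):
    """Map a sequence of (upstream, downstream) readings to actions."""
    actions = []
    for t_up, t_down in samples:
        actions.append("stimulate" if detect_bolus(t_up, t_down, threshold) else "idle")
    return actions

readings = [(37.0, 36.9), (37.0, 36.2), (37.0, 36.9)]  # invented temperatures (degrees C)
print(closed_loop(readings))  # ['idle', 'stimulate', 'idle']
```

In the real device the detection and triggering would run on the implanted electronics rather than in software like this.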
Presented by
Jesse de Alva <jpgarcia@ucsd.edu>
Institution
Massachusetts Institute of Technology, Department of Materials Science and Engineering

Towards Richer Experiences with Accessible Data Visualizations

JiWoong Jang, Arvind Satyanarayan

Abstract
Though they are increasingly viewed as a critical component in all manner of modern Web-based discourse, today’s data visualizations are often inaccessible to blind/low-vision users, despite the proliferation of web standards and accommodating technology like screen-readers. Current best practices of providing Accessible Rich Internet Application (ARIA)/alt-descriptions and interleaved data tables have not been shown to provide experiences sufficiently comparable to those of sighted users. This project, and future work evolving from it, seeks to advance current understanding about possible forms accessible data visualizations can take on the Web, such that screen-reader users can meaningfully and effectively navigate and comprehend the semantics encoded in the visualization. Towards this goal, we propose a formalized design space for data visualizations accessible for screen-reader users, one informed by the intersection of interaction design, disability studies, and visualization task taxonomy literature. In this design space, spanned by information abstraction, data structuring, navigation, and sensory output modalities, we provide a partial exploration with prototypes developed through a participatory design process. The results from this project offer preliminary recommendations for developers of web accessibility standards, screen-readers, and visualization authoring tools on providing support for the creation of usable and accessible data visualizations, ones capable of providing rich experiences for blind/low-vision users.
Presented by
JiWoong Jang <jwjang@mit.edu>
Institution
Carnegie Mellon University, Massachusetts Institute of Technology

Transfer Learning with Constraints

Diego Castro Estrada, Alireza Fallah, Asuman Ozdaglar, Ashia Wilson

Abstract
Machine learning has proven to be a useful tool for solving problems in fields ranging from computer vision to medicine. However, the large amount of data required to train a model can make some problems difficult or outright impossible to solve. Transfer learning is a framework developed specifically to tackle this weakness. Generally, transfer learning consists of leveraging the knowledge held by source models previously trained on a task similar to the desired one. This knowledge is then used as a shortcut to create a target model that performs satisfactorily on the desired task. However, simple transfer learning is still not enough to solve constraint-based problems (like those requiring that a model be differentially private or fair). We propose a modification to the usual transfer learning framework that enables its use in such constrained settings. We train the source model and the target model simultaneously, minimizing a joint cost function with a distance term and a number of indicator functions for constraint sets, which ensure that the target model complies with our specifications. Preliminary results are promising and indicate that our framework may expand the use cases for machine learning in the future.
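The joint objective is not written out in the abstract; one plausible reading of "a distance term and a number of indicator functions for constraint sets", with all symbols introduced here as assumptions, is

```latex
% theta_s, theta_t: source and target parameters; ell_s, ell_t: task losses;
% d: distance term coupling the two models; iota_{C_j}: indicator of
% constraint set C_j (0 inside the set, +infinity outside), which forces
% theta_t to satisfy every constraint at any finite optimum.
\min_{\theta_s,\,\theta_t}\;
  \ell_s(\theta_s) + \ell_t(\theta_t)
  + \lambda\, d(\theta_s, \theta_t)
  + \sum_{j} \iota_{C_j}(\theta_t)
```

The indicator terms make constraint violation infinitely costly, so minimizing the joint cost simultaneously transfers knowledge (through the distance term) and enforces the constraints (e.g. fairness or differential privacy) on the target model.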
Presented by
Diego Castro Estrada
Institution
Florida International University

Using Nano-Engineering for Enhanced Manufacturing of Advanced Carbon Fiber

Carlos Catalano, Carolina Furtado, Palak Patel, and Brian L. Wardle

Abstract
Composites are materials made up of a matrix (resin) and a fiber (carbon). In this project, NECSTLAB proposes a new, more efficient, and less expensive method to cure composite materials when compared to commercial autoclave methods. The method consists of using a nanoporous network (NPN) between composite plies that, through capillary pressure, can reduce void content within carbon fiber plies without the need for external pressure (typically needed in composite manufacturing). This method has been shown to preserve composites’ mechanical properties without the cost of using an autoclave. First, out of three different thicknesses of the same NPN material, the thinnest one was chosen for the new L-shaped samples since it had the least effect on the overall sample thickness and the interlaminar region, while being effective at void removal. The L-shape was chosen to assess the NPN’s efficiency at removing voids in complex geometries. Three types of L-shape samples were manufactured: no NPN, NPN, and cured in an autoclave (also without NPN, the “baseline”). They were then cut and scanned under an X-ray microscope for void content. As expected, both the NPN and autoclave samples had no significant void content, while the no-NPN samples did have a significant amount of voids. Future work includes performing a four-point bending test and comparing the performance of all three types. If results are as expected, this new manufacturing procedure can replace the autoclave curing method and lower production costs of composite materials.
Presented by
Carlos Catalano
Institution
Department of AeroAstro, MIT

Using Supervised Machine Learning Methods to Create a Gene-Based ALS Predictor from Postmortem Transcriptomics Data

Christopher Bain, Divya Ramamoorthy, Ernest Fraenkel

Abstract
Amyotrophic lateral sclerosis (ALS) is a fatal, progressive, paralytic disorder characterized by the degeneration of motor neurons in the brain and spinal cord. Typically, death due to respiratory paralysis occurs within 3 to 5 years. While several pathogenic mutations have been identified, the vast majority of ALS cases have no family history of disease. Thus, for most ALS cases, the disease may be a product of multiple pathways contributing to varying degrees in each patient. To assess this, we use logistic regression and other supervised machine learning methods to analyze a set of ~300 patients’ genetic profiles in order to identify clusters of patients with similar transcriptomic markers. Identification of such clusters would allow us to construct an algorithm that takes in a patient’s gene expression data and makes an accurate prediction of their likelihood of developing ALS. Moreover, finding significant similarities in the genes of patients could give crucial insights for future neurodegenerative research and further care for ALS patients. We constructed a classifier that predicts whether a patient has ALS with 86.7% accuracy. In addition, using the coefficient scores from the method, we found correlations between the heavily weighted genes and compensatory mechanisms the body implements in ALS patients. We believe logistic regression to be a strong classifier for identifying both the likelihood of a patient having ALS and correct weights for specific genes.
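A minimal, pure-Python sketch of the logistic-regression approach described above, trained on a tiny synthetic dataset (the two-gene setup and all "expression" values are invented; the study used ~300 real transcriptomic profiles and standard ML tooling):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=500):
    """Fit logistic regression by per-sample gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify a new expression vector (True = ALS-like)."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5

# Synthetic "expression" vectors for two genes; label 1 = ALS-like profile.
X = [[2.0, 0.1], [1.8, 0.3], [0.2, 1.9], [0.3, 2.1]]
y = [1, 1, 0, 0]
w, b = train_logistic(X, y)
print(predict(w, b, [1.9, 0.2]))  # True
print(predict(w, b, [0.1, 2.0]))  # False
```

The learned weights `w` play the role of the "coefficient scores" the abstract mentions: heavily weighted genes are the ones driving the classification.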
Presented by
Christopher Bain
Institution
University of Maryland, Baltimore County, Department of Physics and Biomedical Engineering

Utilizing OmpW to Determine the Presence of the Foodborne Pathogen Salmonella

Kiara Lewis, Yining Hao, Hadley Sikes

Abstract
Salmonella is a bacterium that can be found in foods such as meats, eggs, and vegetables. Consuming food contaminated with this bacterium can result in serious illness and possibly death. It is especially hazardous to pregnant women, who can pass it to their child. Previously, testing for Salmonella took days, time that some consumers may not have. The purpose of this project was to utilize the outer membrane protein W, or OmpW, found in Salmonella to create a test that can identify the bacterium within minutes. The project used the Rapid Affinity Pair Identification via Directed Selection (RAPIDS) process. This included obtaining a plasmid encoding the protein so that it could be expressed, purified, and used to find binders. This allowed identification of a protein that binds to OmpW and thus, when incorporated into the test, indicates the presence of the pathogen. A test like this is groundbreaking. It will save companies money, as they will not have to recall millions of dollars’ worth of food products, and it can prevent lawsuits. It can also save the lives of the many consumers who would be put at risk if a Salmonella outbreak occurred.
Presented by
Kiara Lewis <kklewis@mit.edu>
Institution
Spelman College, Department of Chemistry


Access to Sexual and Reproductive Health Materials

Maida Raza, Dr Sally Haslanger

Abstract
Gender inequity is a pressing issue in Oyugis, Kenya. A widely cited reason for this is the relatively low level of secondary education amongst girls. One might wonder why girls drop out of school earlier than boys. There are several possible hypotheses, including: teenage girls get pregnant, or they contract HIV/AIDS (or other STIs). These hypotheses raise the question of why girls get pregnant or contract HIV/AIDS. Several studies have found a link between the lack of sexual education available to youth and higher rates of HIV/AIDS and pregnancy amongst them. We are interested in exploring how lack of access to tangible and intangible Sexual and Reproductive Health (SRH) resources might be responsible for higher school dropout amongst girls. Thus, we focus on the questions: What kind of sex education is being provided in schools and churches, if any? How influenced is it by religious restrictions and sex stigma? And lastly, what sort of policies and practices exist around the availability and pricing of contraceptives, abortion, and STI testing, diagnosis, and treatment?

As background, in Homa Bay County, of which Oyugis is part, 94% of boys and 69% of girls complete primary school. The Population Council reports that 66% of females between the ages of 13-19 cite adolescent pregnancy as an imperative reason for dropping out of school. Moreover, Kenya’s National HIV (2020) survey shows that the prevalence of HIV/AIDS among women is 6.6%, compared to 3.1% in men.

It is hard to determine whether teenage pregnancy and/or STIs are causing lower levels of education amongst women, vice versa, or both, in a looping process. Therefore, we are interested in exploring whether youth access to SRH tangible and intangible resources - including sex education, contraception/abortion, and treatment for STIs - in Kasipul Sub County, Kenya, contribute to girls’ higher dropout rate.

Based on our findings, we will provide practical recommendations to our partners in Kenya to combat teenage pregnancy, rampant cases of HIV/AIDS, and other STIs amongst adolescents and keep girls in the school. We hope that these improvements will contribute to long-term improvements in gender equality levels in Oyugis, Kenya.
Presented by
Maida Raza
Institution
Earlham College, Department of Economics and Mathematics

From Investment to Influence? The Effects of the Belt and Road Initiative on Participating States

Naomi Aladekoba, Taylor Fravel, Eleanor Freund, Nick Ackert, Emma Campbell-Mohn, John Minnich, Raymond Wang

Abstract
China’s Belt and Road Initiative (BRI) has raised international concerns over possible ‘debt trap diplomacy’ and the possibility that China could use this foreign policy initiative to assert its dominance in the international system. However, systematic research on the impact of the BRI has yet to be conducted, and it is therefore difficult to determine whether current criticisms of the BRI are overstated or of reasonable concern. The purpose of this project was to provide qualitative analysis of specific case studies, to assess the impact of BRI construction contracts on participating states and explore potential pathways for greater Chinese influence. The case studies highlight countries with the greatest absolute and percent changes in Chinese construction contracts after joining the BRI. They are therefore the most likely cases for greater Chinese influence, following the assumption that greater economic interactions in the asymmetrical economic relationship between China and BRI countries can translate to potential pathways of influence. Results indicate that for some countries that experienced greater changes in economic interactions, China is able to exercise some political influence through public/elite opinion (increasing its soft power) and is able to garner support on key issues of salience. However, it is important to note that influence is difficult to measure, and alignment on certain issues may be a result of previous ties with China that predate the BRI.
Presented by
Naomi Aladekoba
Institution
Department of International Studies, Spelman College

Multiple Sessions Support Learning in Virtual Reality

Camila Lee, Meredith Thompson, Eric Klopfer

Abstract
Well-designed immersive virtual reality experiences motivate users by providing a rich, interactive environment; however, this complex immersive environment can cause significant cognitive load. This study aimed to understand how to optimize learning in immersive virtual reality (IVR) to help students gain conceptual knowledge of cells. A pilot study was conducted with six participants who experienced the IVR game Cellverse twice, approximately one week apart. Three of the six participants were given a goal during the first session, and three were told to explore; during the second session, those instructions were switched. Four forms of data were collected: video and audio recordings, questionnaires, drawings, and short semi-structured interviews. Participants’ cell drawings suggest that they learned during the first session, retained information between sessions, and continued to learn new information about cells throughout the study. Participants reported that both explore and goal-oriented sessions were useful, but tended to prefer the most recent session, suggesting a recency effect. Results from this study indicate that educators should include multiple sessions of an IVR experience to support learning. Future analysis will investigate the impact of activity design on presence, agency, and cognitive load.
Presented by
Camila Lee
Institution
Wellesley College, Department of Computer Science

Political Dynasties in Brazil: Quantification through Text Mining and String Matching

Octavio E. Lima, F. Daniel Hidalgo (Ph.D.)

Abstract
Political dynasties remain common across a wide range of countries today. Nonetheless, their scope has been relatively unexplored. The greatest challenge of this type of research is quantifying dynasties. In that sense, the existing studies are encouraging but limited. Previous scholars aimed to quantify dynasties solely based on the last names of political candidates. This creates substantial room for error; last names alone are not good predictors, especially because some are very common. Using Brazil as a case study, we overcome this problem by web-scraping the full names of the parents of mayoral candidates. We then grouped these names by state to further increase the accuracy of the data. To find out which Brazilian states had the highest percentages of dynasties, we compared our web-scraped names with an existing dataset of past politicians using approximate string matching. The methodology used in this paper may shed light on the unknown consequences of dynasties. By making our dataset publicly available, we encourage further studies in this field. Specifically, we suggest that future scholars investigate why Northeastern states in Brazil have the highest rates; how the Sarney family influences the results in Maranhão; and whether dynastic candidates are wealthier, on average, than non-dynastic ones.
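As an illustration of the approximate string matching step, web-scraped parent names can be compared against a roster of past politicians with Python’s standard-library difflib; the helper, threshold, and names below are hypothetical, and the project’s actual matcher and similarity metric are not specified in the abstract.

```python
from difflib import SequenceMatcher

def best_match(name, candidates, threshold=0.85):
    """Return (closest candidate, similarity ratio), or (None, score)
    if nothing clears the threshold."""
    best, best_score = None, 0.0
    for cand in candidates:
        score = SequenceMatcher(None, name.lower(), cand.lower()).ratio()
        if score > best_score:
            best, best_score = cand, score
    return (best, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical roster and a scraped spelling with an accent and a typo
past_politicians = ["Jose da Silva Sarney", "Maria Souza Lima"]
match, score = best_match("José da Silva Sarnei", past_politicians)
```

With these toy inputs the misspelled, accented scraped name still matches the roster entry, which an exact last-name comparison would miss.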
Presented by
Octavio E. Lima <octavioe@mit.edu>
Institution
University of Oregon, Department of Economics

Reflections on Remote Learning: Examining Educational Equity through the COVID-19 Pandemic

Harley Gutierrez, Chris Buttimer, Natasha Esteves, Farah Faruqi, Aicha Soukab

Abstract
The landscape of PreK-12 education in the United States experienced rapid changes over the course of the COVID-19 pandemic. Remote learning became widespread, drawing attention to facets of education that may call for examination as schools prepare to open in person this fall. The shift to remote learning exacerbated pre-existing societal inequities in and out of school settings. To understand teaching under these circumstances, the Teaching Systems Lab investigated the lived experiences of educators across the country, centering those who are Black, Indigenous, or other people of color and who serve marginalized communities. Utilizing a qualitative, interview-based approach, we collected teacher perspectives that highlight potential best practices for schools to adhere to as education continues to change. Using thematic analysis, we classified the data among the following themes: 1) what worked during the 2020-21 academic year, 2) what did not work during the 2020-21 academic year, and 3) what educators recommend moving forward. Initial findings suggest that humanizing classrooms and curricula has been effective for teaching, while restricting autonomy and maintaining inflexibility have been ineffective. Additionally, the narrative that students are experiencing a “learning loss” has been challenged, and there is a capacity for reimagining and redesigning schools to make them more humane and responsive when educators’ input is valued.
Presented by
Harley Gutierrez
Institution
University of Texas at Austin, Department of Psychology

State Legislatures’ Efforts to Restrict and Expand Voting Access

Christina James and Charles Stewart III Ph.D.

Abstract
Voting gives constituents the opportunity to be involved in government and advocate for their interests. Upholding the constitutionality and equity of elections is essential to fostering a healthy democracy. This study examines recent efforts by state legislatures to expand and restrict voting access. We measure the degree to which state legislatures furthered legislation to expand or restrict voting access in 2021. This measure combines the number of restrictive/expansive provisions and how far along the legislative process these provisions advanced. Our measures of expansiveness and restrictiveness are correlated with measures of state-government partisan control and partisan competition in the state. We find that restricting voting access is predicted by state competitiveness and partisan control of the state government, while expanding access is predicted by partisan control alone. The results of this project provide insight into legislative trends and can help foster awareness of state election health and voting access. Future research may seek to compare these results, aggregated within the United States, with results in other nations.
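The combined measure described above, provision counts weighted by how far each provision advanced, can be sketched as follows; the stage labels and weights are invented for illustration and are not the authors’ actual coding scheme.

```python
# Hypothetical stage weights: provisions that advanced further count more
STAGE_WEIGHTS = {"introduced": 1, "passed_chamber": 2, "passed_both": 3, "enacted": 4}

def legislation_score(provision_stages):
    """Sum stage-weighted counts of a state's restrictive (or expansive) provisions."""
    return sum(STAGE_WEIGHTS[stage] for stage in provision_stages)

# One state's restrictive provisions and how far each advanced
restrictive = ["introduced", "enacted", "passed_chamber"]
score = legislation_score(restrictive)
```

Computing the same score over expansive provisions yields the paired expansion measure, and the two can then be correlated with partisan-control variables.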
Presented by
Christina James
Institution
Department of Political Science, Spelman College

UFOs, apparently: Evidential strategies and epistemic adverbs in scientific discourse on Twitter's #ufotwitter community

Maya Návar, Casey Hong, Graham Jones

Abstract
As an increasingly legitimized site of scientific discourse, social media offers unique insight into the interactional nature of data stabilization, the process by which information comes to be accepted as evidence across a community. Using frameworks rooted in linguistic anthropology and interactional stance, we examine the evidential strategies used by members of #ufotwitter, a community of UFO enthusiasts for whom evidence is a frequently discussed and highly contested topic. Our project focuses on the use of epistemic adverbs to modify various forms of evidence. We compiled a list of epistemic adverbs using the categorizations proposed by Wierzbicka (2006) and Quirk et al. (1985) and used the Twitter API to query tweets containing those adverbs plus the term “#ufotwitter”; we also ran a network analysis of the community to identify distinct epistemic communities and users with a large following. Drawing upon Goffman’s model of participation, we developed a coding schema that identifies whether an adverb does epistemic work on a phenomenon, author, animator, principal (socially responsible figure), or representation of evidence. The next step in our project is to manually code the tweets using this schema and compare them across the epistemic communities found in our network analysis. We expect our findings to highlight the relationship between language, data, evidence, and authority in online scientific discourse.
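A minimal sketch of the adverb-screening step, run over a local list rather than the Twitter API; the adverb subset and example tweets are illustrative, not the project’s actual lists.

```python
import re

# Illustrative subset of epistemic adverbs (the full lists follow
# Wierzbicka 2006 and Quirk et al. 1985)
EPISTEMIC_ADVERBS = ["apparently", "allegedly", "evidently", "presumably", "clearly"]
PATTERN = re.compile(r"\b(" + "|".join(EPISTEMIC_ADVERBS) + r")\b", re.IGNORECASE)

def epistemic_hits(tweet):
    """Return the epistemic adverbs found in a tweet, lowercased."""
    return [m.lower() for m in PATTERN.findall(tweet)]

tweets = [
    "Apparently the footage was never released #ufotwitter",
    "Great thread, thanks for sharing #ufotwitter",
]
hits = [epistemic_hits(t) for t in tweets]
```

In the real pipeline the same adverb list is instead embedded in Twitter API search queries alongside “#ufotwitter”, and matched tweets feed the manual coding stage.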
Presented by
Maya Návar
Institution
Stanford University, Department of Linguistics

What Makes J-PAL RCT Datasets Popular?

Ximena Mercado Garcia, Sarah Kopper, and Jack Cavanagh

Abstract
Data sharing can be beneficial for the research community, as it allows for data re-use, can help answer questions on external validity and generalizability, and enables the replication and confirmation of results. The Abdul Latif Jameel Poverty Action Lab (J-PAL), a global research center working to reduce poverty by ensuring that policy is informed by scientific evidence, archives datasets and code from randomized controlled trials in the J-PAL Dataverse, its free data repository. J-PAL collects data on downloaders to track its progress toward its goals of democratizing data access and encouraging research transparency and replicability. In order to increase the number of downloads and encourage data publication, we want to understand what makes a dataset popular among downloaders. To do so, we compare the distribution of characteristics of popular datasets with those of all datasets. These characteristics include the income level of the downloaders’ countries, the general position of the downloaders, the intended use of the datasets, and the sector and region of the research projects. We found that there are no major differences between popular and non-popular datasets. For both, health is the most prominent sector, while South Asia is the region whose projects receive the most downloads. Likewise, the downloader profiles of popular and non-popular datasets are similar: most datasets in each group are downloaded by graduate students and residents of high-income countries, and are intended for exploratory and replication purposes. These results rule out these features as drivers of popularity, but better data collection and more analysis, such as on dataset documentation, are needed to determine the reasons for download popularity. Understanding this can further J-PAL’s goal of making research data from randomized evaluations in the social sciences widely available and accessible.
Presented by
Ximena Mercado Garcia <ximena@mit.edu>
Institution
The University of Texas at Austin

Back to top

Can Interactions Between PeV Cosmic Rays and Local Galactic Molecular Clouds Explain the Positron Excess?

Arianna Colón Cesaní 1, Field Rogers 2, Kerstin Perez 2

Abstract
Cosmic rays (CRs) are energetic particles that traverse the galaxy and can provide information about different astrophysical phenomena. Rare antimatter particles, including positrons, are particularly interesting components of CRs because they are sensitive to the dynamics of cosmic ray propagation in the galaxy and could be probes of dark matter. An excess of positrons, relative to galactic CR models, has been observed at high energies by several observatories. These measurements disagree with conventional models of CR propagation and the production of secondary CRs, implying a source of positrons that is not included in the models. Moreover, leptons cannot propagate long distances in the galaxy without losing too much energy. As a result, sources close to Earth are necessary to explain the rise in CR positron flux. This project proposes a model to determine whether the interactions of PeV-scale CR protons (which conventional CR propagation models do not consider) with local galactic molecular clouds (GMCs) could explain the observed excess near Earth. In this work, we use new catalogues of local GMCs, alongside positron flux data from AMS-02 and PeV-scale CR data from the IceTop detector. We calculate that the total positron rate from nearby GMCs is three orders of magnitude smaller than the positron excess; thus, our proposed mechanism cannot explain the excess.
Presented by
Arianna Colón Cesaní
Institution
1. Department of Physics, University of Puerto Rico-Mayagüez, 2. Department of Physics, Massachusetts Institute of Technology

Catalytic Living Ring-Opening Metathesis Polymerization

Ekua Beneman

Abstract
The ability to deliver drugs for cancer treatment with improved efficacy and decreased toxicity is vital to treating new diseases and enhancing the utility of developed drugs. Specifically, the Johnson group has shown that bottlebrush polymer architecture pro-drugs are an effective platform for drug delivery because of their synthetic modularity through ring-opening metathesis polymerization (ROMP). ROMP enables the synthesis of bottlebrush polymers while achieving near-quantitative conversion. However, ROMP requires stoichiometric use of a ruthenium catalyst, which results in relatively high levels of metal in the final products. This residual metal content of the bottlebrush drug delivery materials made through ROMP prevents their use in a wider variety of applications where higher or more frequent doses of material are required. Catalytic living ROMP, which uses a chain transfer agent (CTA), has been proposed as a method to make these types of polymers with 100-1000x less catalyst and lower metal loadings. The application of catalytic living ROMP to these bottlebrush architectures for drug delivery would thus be advantageous. Toward this end, we are exploring new routes to synthesize CTAs to conduct catalytic living ROMP.
Presented by
Ekua Beneman <ebeneman@mit.edu>
Institution
Spelman College, Department of Chemistry and Biochemistry

Changing Coastal Flooding for At-Risk Communities in the Developing World

Ann-Marsha Alexis, Dara Entekhabi, Miho Mazereeuw, Danielle Wood

Abstract
In light of climate change, flooding disproportionately affects hundreds of millions of people in coastal, low elevation urban areas. Moreover, it is often the case that those affected are already facing inequalities due to their specific identity, class, income level, or occupation. Flooding is exacerbated by high tides associated with sea-level rise and the incidence of storms. However, there is a current lack of research that considers these environmental variables in conjunction. We assessed Maputo, Mozambique due to its status as a post-colonial state and its sensitivity to flooding. We aspire to expand our research to other similarly situated regions. To provide policy-makers with needed and relevant information, we are examining the effects of flooding risks from a systems thinking viewpoint encompassed by the EVDT (Environment-Vulnerability-Decision-Technology) framework developed by Space Enabled at the MIT Media Lab. In addition to modeling the environment, the framework allows us to examine the human vulnerabilities to flooding, environmentally relevant decision-making and technology production or transfer. For the environmental model, we are using harmonic modeling to study tides in Maputo in order to predict the occurrences of “sunny day” flooding. This multi-step analysis also includes sea-level rise projections, probabilistic storm generation, and recently available space-enabled measurements (German Space Agency’s Tandem-X and NASA’s IceSat2) to develop high-resolution elevation and risk assessment maps. We are partnering with relevant scientists and community members for more complete data and expect to collaboratively develop coastal flooding hazard mitigation strategies.
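The harmonic tide model mentioned above represents sea level as a mean level plus a sum of sinusoidal constituents; the sketch below uses two real constituent periods (M2, the principal lunar, and S2, the principal solar), but the amplitudes, phases, mean sea level, and flooding threshold are invented for illustration.

```python
import math

# Hypothetical constituents: (amplitude in m, period in hours, phase in rad)
CONSTITUENTS = [(1.0, 12.42, 0.0),   # M2 (principal lunar semidiurnal)
                (0.4, 12.00, 1.0)]   # S2 (principal solar semidiurnal)

def tide_height(t_hours, msl=0.0):
    """Harmonic tide prediction: mean sea level plus summed constituents."""
    return msl + sum(a * math.cos(2 * math.pi * t_hours / period + phase)
                     for a, period, phase in CONSTITUENTS)

# Hours in one day when the predicted tide tops a flooding threshold
threshold = 1.2
flood_hours = [t for t in range(24) if tide_height(t, msl=0.3) > threshold]
```

Hours exceeding the threshold mark candidate “sunny day” flooding windows; the actual analysis fits constituent amplitudes and phases to tide-gauge observations and layers on sea-level-rise projections rather than assuming them.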
Presented by
Ann-Marsha Alexis
Institution
Department of Earth, Atmospheric and Planetary Sciences, School of Architecture + Planning, MIT Media Lab

Comparison of STARsolo and DropSeq Tools for single cell RNA-sequencing Alignment of Pancreatic Cancer Organoids

Florencia Burian, Benjamin Mead, Conner Kummerlowe and Alex Shalek

Abstract
Pancreatic ductal adenocarcinoma (PDAC), also known as pancreatic cancer, is a rare disease that is difficult to understand and treat. PDAC tumors are poorly described by mutational profiling and are therefore often characterized by phenotypic behavior, such as basal or classical tumors, with basal being the most aggressive type. Drug screening can test a number of different drugs that change a cell’s behavior and can be used in cell-cultured models to understand PDAC vulnerabilities. These perturbations in cell behavior enable screening of the gene expression underlying these behaviors, which can be examined through single-cell RNA sequencing (scRNA-seq). Traditional scRNA-seq methods are costly to scale, making them impractical, but new scRNA-seq technologies such as compressed screening provide high-throughput, high-quality data for low-input samples at low cost, making screening possible and more effective. As a result, an equally efficient scRNA-seq data analysis workflow is needed. We analyzed two different sequencing alignment tools, STARsolo and DropSeq Tools, comparing the number of cells detected, cell quality, and the gene expression of these cells for each aligner. We observed that STARsolo gave much greater cell recovery than DropSeq Tools but found that DropSeq Tools detected more cells in arrays with smaller totals of recovered cells. Based on this preliminary analysis, STARsolo performed better than DropSeq Tools in arrays with greater sample sizes. However, there were biases in the genes STARsolo detected compared with the genes DropSeq Tools detected, and different gene expression patterns were shown for each aligner. In our next steps, we aim to investigate whether these biases extend to all samples or are context specific. Understanding which genes are responsible for PDAC’s aggressive behavior will enable us to obtain the most accurate and useful data for learning the fundamental nature of PDAC.
Presented by
Florencia Burian <burianf@mit.edu>
Institution
Kean University, NJ Center for Science, Technology and Mathematics

Estimating Changes in Ice-Flow in Byrd Glacier from Observational Data

Florencia Corbo-Ferreira, Meghana Ranganathan, Brent Minchew

Abstract
Climate change has affected all aspects of human life and touched all parts of the planet. One of its most prevalent impacts has been on Earth’s ice. Fast-flowing glaciers in Antarctica eject significant masses of ice into the ocean, ultimately contributing to global sea-level rise. The behavior of these glaciers in a warming climate, particularly in Antarctica, is the greatest source of uncertainty in our projections of sea-level rise. Therefore, constraining how glaciers will change is of the utmost concern for mitigation of climate change. An increased understanding of the ways Antarctica’s ice flows will contribute to climate knowledge. Generally, ice flow is described in ice-flow models through Glen’s flow law, which defines the relationship between stress and strain rate using a stress exponent n. Here, I estimate the evolution of the stress exponent n in Glen’s flow law in Byrd Glacier as the glacier flows toward the ocean. I used LANDSAT-8 velocity data to create velocity profiles across the ice stream. I then used an ice-flow model to fit values of n to each profile. I found that the value of the stress exponent n increases as the point of observation approaches the ocean. Incorporating these spatially varying values of the stress exponent n will reduce uncertainties in ice-flow projections and contribute to a greater understanding of glacier dynamics.
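Glen’s flow law takes the power-law form strain rate = A·(stress)^n, so taking logarithms turns it into a line whose slope is the stress exponent n. A minimal sketch of fitting n by log-log least squares on noiseless synthetic data (the values of A and n here are textbook illustrations, not results from Byrd Glacier):

```python
import math

# Glen's flow law: strain_rate = A * stress**n.  In log-log space this is a
# straight line with slope n, so n can be fit by ordinary least squares.
def fit_stress_exponent(stress, strain_rate):
    xs = [math.log(s) for s in stress]
    ys = [math.log(e) for e in strain_rate]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic profile with A = 1e-24 and n = 3 (a common textbook value)
stress = [5e4, 1e5, 2e5, 4e5]                 # Pa
strain_rate = [1e-24 * s ** 3 for s in stress]
n_est = fit_stress_exponent(stress, strain_rate)
```

With real velocity-derived strain rates the fit is noisy, which is why the project fits n profile by profile through an ice-flow model rather than a bare regression.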
Presented by
Florencia Corbo-Ferreira
Institution
University of Florida, Department of Environmental Engineering Sciences

Functionalized Carbon Nanotubes for the Asymmetric Electrochemical Epoxidation of Olefins

Cindy Serena Ngompe Massado, Shao-Xiong Lennon Luo and Timothy M. Swager

Abstract
Chiral epoxides serve as reagents for the synthesis of enantiomerically pure, biologically useful molecules. The synthesis of chiral epoxides is therefore crucial to the production of pharmaceutical compounds. The production of epoxides on a large scale often uses a heterogeneous system with peroxide-based oxidants, such as hydrogen peroxide or alkyl peroxides, which require extreme and toxic conditions. Homogeneous systems for obtaining enantio-enriched epoxides have involved asymmetric catalysis by metal complexes with chiral organic ligands. These have displayed great selectivity and yield but poor scalability. An ideal system would merge the benefits of these systems for more efficient large-scale synthesis. In this study, we present an electrochemical system utilizing carbon nanotubes (CNTs) functionalized with chiral ligands chelated to metal catalysts for the asymmetric epoxidation of olefins with water as an oxygen source. Here we show the optimization of synthesis conditions for the mono-oxazole and bis-oxazole chiral ligand precursors, whose successful synthesis was confirmed by NMR and MS. These were then used for the functionalization of MWCNTs. Functionalization by the bisoxazoline precursor was evaluated by XPS to yield 2.2 functional groups per 100 CNT carbons. By continuing this work, we aim to address the pitfalls of both heterogeneous and homogeneous systems. This system is expected to operate at room temperature and produce benign side products, making it safer than the industry standard, while maintaining efficiency due to enhanced electrochemical electron transfer by the functionalized CNTs. These results would present new opportunities in enantioselective epoxidation and possibly dihydroxylation, as well as in asymmetric catalysis more broadly.
Presented by
Cindy Serena Ngompe Massado
Institution
Case Western Reserve University

How black holes swallow stars: analyzing the late X-ray emission of a tidal disruption event

Isabella A. Guilherme, Megan Masterson and Erin Kara

Abstract
When a star gets sufficiently close to a supermassive black hole (SMBH), it is torn apart by the SMBH’s tidal forces. Known as a “tidal disruption event” (TDE), this can result in a luminous flare of radiation that allows us to detect dormant SMBHs and understand general features of accretion. However, the current canonical model for TDEs does not satisfactorily explain the delayed X-ray emission observed in some UV/optically detected TDEs. Here we analyzed the X-ray spectrum and time evolution of AT2019azh, a UV/optically detected source that shows a rise in X-ray flux ~250 days after its discovery. We collected and reduced data obtained by XMM-Newton, NICER, and Swift, then modeled the spectra and generated light curves. The spectra are well described by a disk blackbody model, and in the highest-quality XMM-Newton spectra, we also observe a high-energy tail, potentially associated with Comptonization in a hotter plasma. The disk luminosity scales with temperature as per the Stefan-Boltzmann law, which is consistent with thermal emission from the accretion disk. It is still necessary, however, to further investigate the additional high-energy tail. Doing so will refine our understanding of TDEs and other complex accreting systems, such as active galactic nuclei.
Presented by
Isabella A. Guilherme <iag2124@mit.edu>
Institution
Columbia University, Massachusetts Institute of Technology

How do Black Holes Form? Developing a Statistical Tool to Analyze Gravitational Waves with Future Detectors

Divjyot Singh, Kwan-Yeung Ng, Salvatore Vitale

Abstract
Black holes (BHs) allow us to probe and understand the extremes of the universe. The collision of two BHs in a binary black hole (BBH) merger creates detectable gravitational waves (GWs), which are disturbances in the space-time continuum. Detectors like LIGO allow us to observe these GW signals, which can provide information about the properties of BHs such as mass and formation channels. Future-generation GW detectors will dramatically increase the range of BBH data we receive, providing access to the early universe. In this project, we simulated BBH data from different formation channels, population III stars (PopIII) and primordial black holes (PBHs), using population synthesis analysis. We then created two statistical models based on Bayesian data analysis: one with PopIII and PBHs and one with only PopIII. We calculated the likelihood of both models on the simulated data and observed a higher likelihood for the model with PopIII and PBHs. Hence, we concluded that this statistical technique can successfully identify the presence of PBHs in BBH data. This project provides an alternate method to explore the existence of PBHs.
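The model-comparison logic can be sketched with toy one-dimensional mass data: a single-channel model versus a two-component mixture that assigns probability to a heavier population. All distributions, weights, and sample sizes below are invented for illustration and bear no relation to the project’s actual population-synthesis models.

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    """Normal probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def log_likelihood(samples, pdf):
    return sum(math.log(pdf(m)) for m in samples)

# Toy "observed" masses: mostly a light channel plus some heavy events
rng = random.Random(1)
masses = [rng.gauss(30, 5) for _ in range(80)] + [rng.gauss(80, 10) for _ in range(20)]

def pdf_single(m):   # stand-in for a PopIII-only model
    return gauss_pdf(m, 30, 5)

def pdf_mixture(m):  # stand-in for a PopIII + PBH mixture
    return 0.8 * gauss_pdf(m, 30, 5) + 0.2 * gauss_pdf(m, 80, 10)

ll_single = log_likelihood(masses, pdf_single)
ll_mixture = log_likelihood(masses, pdf_mixture)
```

On this data the mixture achieves the higher log-likelihood because only it explains the heavy events, mirroring the abstract’s finding that the PopIII-plus-PBH model is preferred when PBHs are present.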
Presented by
Divjyot Singh
Institution
New Jersey Institute of Technology

Implementation of U(1) Group Symmetry on Energy Flow Networks

Pedro Rivera-Cardona, Jesse Thaler, Rikab Gambhir

Abstract
The Large Hadron Collider (LHC) is a high-energy particle accelerator that produces multitudes of particles per collision. Machine learning techniques are used on LHC data to understand particle distributions and obtain insight into how experimental measurements relate to theoretical frameworks. Energy Flow Networks (EFNs) study the unordered, variable-length sets of particles from collision events. This architecture is used to analyze and learn from collider events and other particle physics phenomena. EFNs parametrize infrared- and collinear-safe observables through a learnable per-particle function Φ and a latent space function F. Previously, EFNs have been used to study single jets at a time. We present an extension of the EFN architecture with U(1) cylindrical symmetry, which allows for full-event analysis with manifest periodicity. This was achieved by implementing a new initial layer Φ_0, which avoids altering the dataset. Φ_0 imposes a coordinate transformation on the azimuthal angle coordinate ϕ, making the periodicity manifest. Finally, the extension will enable a Fourier decomposition of events into cosines and sines, allowing us to extract information in an interpretable basis. After the implementation, we apply the extended architecture to CMS Open Data for further analysis, such as quark/gluon discrimination and top-jet tagging.
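One common way to make the azimuthal input manifestly 2π-periodic, in the spirit of the Φ_0 layer described above, is to embed ϕ as (cos ϕ, sin ϕ). The sketch below illustrates that idea only; it is not the project’s actual layer, and the (pT, y, ϕ) particle format is an assumption.

```python
import math

def phi_0(pt, y, phi):
    """Hypothetical initial layer: replace the azimuthal angle with its
    (cos, sin) embedding so the representation is manifestly 2*pi-periodic."""
    return (pt, y, math.cos(phi), math.sin(phi))

# A particle at phi and at phi + 2*pi maps to (numerically) the same point,
# so downstream layers see the periodicity by construction.
a = phi_0(30.0, 0.5, 0.1)
b = phi_0(30.0, 0.5, 0.1 + 2 * math.pi)
```

Because cos and sin are the lowest Fourier modes, this embedding also connects naturally to the Fourier decomposition of events mentioned in the abstract.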
Presented by
Pedro Rivera Cardona <priverac@mit.edu>
Institution
University of Puerto Rico, Mayagüez

Latent Space Modeling of Heterochromatin Protein 1 (HP1) Evolution

Abigail M. Adams, Xingcheng Lin, Bin Zhang

Abstract
The heterochromatin protein 1 (HP1) protein family is evolutionarily conserved and plays a critical role in heterochromatin formation, gene silencing, and other essential cell functions like DNA replication and repair, telomere maintenance, chromosome segregation, and positive regulation of gene expression. Most eukaryotes contain several HP1 isoforms, and mammals have three primary isoforms: HP1α, HP1β, and HP1γ. All HP1 isoforms include an N-terminal chromatin organization modifier (‘chromo’) domain and a closely related C-terminal chromoshadow domain, linked by an intrinsically disordered hinge region. While all isoforms possess the characteristic HP1 architecture, they each fulfill different functional roles. Here, we use a latent space generative model trained with variational autoencoders to infer the evolution of 1,048 sequences with architectures similar to HP1. The distribution of these sequences projected onto the trained latent space represents their evolutionary relationships, with ancestral sequences located near the origin and phylogenetically close sequences in close spatial proximity in latent space. Using K-means clustering of the latent space representation, we identify four distinct clusters. Among them, three clusters are each represented primarily by one HP1 isoform, and the fourth cluster is largely devoid of known HP1 isoforms. Moreover, our analysis of the sequences in each cluster shows different degrees of amino acid conservation. Notably, the disordered hinge region, which is believed to play a central role in the structural dynamics of HP1 proteins, shows the least sequence conservation, consistent with previous studies. Our clustering analysis based on the latent space generated from HP1-like sequences elucidates structural and functional relationships between HP1 isoforms. Further analysis is needed to determine which characteristics drive the clustering of sequences in the latent space.
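The clustering step can be sketched with a minimal k-means on toy two-dimensional latent coordinates; the points below are invented, and the real analysis runs K-means on the VAE latent space of the 1,048 HP1-like sequences.

```python
import random

def kmeans(points, k, iters=25, seed=0):
    """Minimal k-means on 2-D points (illustrative only)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: (p[0] - centers[c][0]) ** 2
                                        + (p[1] - centers[c][1]) ** 2)
            clusters[nearest].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [(sum(p[0] for p in cl) / len(cl),
                    sum(p[1] for p in cl) / len(cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two toy groups standing in for isoform clusters in latent space
pts = [(0.1, 0.0), (0.0, 0.2), (5.0, 5.1), (5.2, 4.9)]
centers, clusters = kmeans(pts, k=2)
```

On well-separated groups like these the assignment recovers the two clusters regardless of initialization; in practice one would also choose k (here four, per the abstract) via a criterion such as the elbow or silhouette score.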
Presented by
Abigail Adams
Institution
Massachusetts Institute of Technology, Department of Chemistry

The Effects of Inflammation on the Brain Microvascular Endothelium

Noah Okada, Héctor Cantú Bueno, Gemma Molins, Mercedes Balcells-Camps

Abstract
There is an increasing amount of evidence correlating vascular dysfunction with the development of Alzheimer’s disease, dementia, and other neurodegenerative disorders. Research suggests that cerebrovascular dysfunction leads to disruptions of neurovascular structures such as the blood-brain barrier (BBB), which contribute to the progression of these disorders. Understanding the factors that contribute to the breakdown of the BBB will help facilitate the diagnosis and treatment of these conditions. Our research sought to establish an in vitro model of the BBB using human brain microvascular endothelial cells (HBMVECs) to understand how inflammation can contribute to the disruption of the BBB. This study examined the effects of inflammation and hypoxia on HBMVECs cultured in static conditions. By culturing HBMVECs in the presence of CoCl2 and tumor necrosis factor-alpha (TNF-α), we collected detailed data on cell growth, viability, and protein expression under varying conditions of biological stress. Understanding the effect of inflammation under these conditions provides a unique and ecologically valid model of the compounded effects of stress on the BBB. These findings will pave a path for future research to understand how the onset and progression of neurodegenerative disorders can be influenced by chronic inflammation in the brain.
Presented by
Noah Okada
Institution
Department of Neuroscience and Behavioral Biology, Emory University; Department of Computer Science, Emory University

The Synthesis and Characterization of N-Heterocyclic Iminophosphazenes (NHIPz), a Novel Subclass of Superbases

Brian Valladares, Keita Tanaka, and Christopher C. Cummins

Abstract
By definition, superbases have a basicity greater than that of Proton Sponge, or a pKBH+ greater than 18.6 in acetonitrile. Superbases are known for their ability to access pronucleophiles with high pKas, which drives base-catalyzed reactions that normally would not be favorable. However, they commonly suffer from side reactions due to their high reactivities. Recently, designer superbases have enabled highly enantioselective reactions, offering wide applications in the drug discovery process. This work endeavors to introduce a novel subclass of superbases, N-heterocyclic iminophosphazenes (NHIPz). The synthesis was achieved through the key step of a 1,3-dipolar cycloaddition of alkynes with an azophosphine (ArN=N–P(NMe2)2). In terms of structure, the amino substituents on the phosphorus of NHIPz parallel well-known phosphazene superbases. The cyclic nature of NHIPz provides an added rigidity that has not been previously reported, while improving the selectivity of the superbase in reaction development. Future work aims to apply NHIPz to known superbase-mediated or -catalyzed reactions and to compare their reactivities and selectivities.
Presented by
Brian Valladares <brianv@mit.edu>
Institution
Williams College, Department of Chemistry

Tunable Polymer-MOC Properties via RAFT Polymerization

Taylor A. Talley, Matthew A. Pearson, Jeremiah A. Johnson

Abstract
Metal-organic cages (MOCs) are discrete coordination cages composed of metal ions connected by organic ligands. The modularity and inherent porosity of MOCs make them attractive materials for applications in gas storage and separation, though they often suffer from a lack of processability. We therefore sought to develop a hybrid polymer-MOC (polyMOC) material with accessible permanent porosity that could enable more favorable mechanical properties. We utilized reversible addition-fragmentation chain-transfer (RAFT) polymerization because of its ability to control the molecular weight and dispersity of chains and to allow the synthesis of block copolymers and polymers with more complex architectures. We developed a procedure using the RAFT polymer ligand, free isophthalic acid, and Cu(NO3)2·2.5H2O to form novel polymer-Cu24(m-bdc)24 cage hybrids. Using a series of ratios of free ligand to polymer-bound ligand, we determined, based on PXRD, that the crystallinity of the material could be tuned. 1H NMR of digested materials and IR spectroscopy further confirmed the successful incorporation of the polymer ligand in the material and showed that polymer incorporation varied with the initial stoichiometric ratios. This research lays a foundation for the development of bulk copper and rhodium polyMOC materials, including the addition of multiple polymer blocks for processable polyMOC materials.
Presented by
Taylor Talley
Institution
Spelman College, Department of Chemistry and Biochemistry

Understanding X-Ray Cavities

Hurum Maksora Tohfa, Michael McDonald

Abstract
X-ray observations have shown that the wholesale cooling of the universe is being offset by mechanical heating from active galactic nuclei (AGN). Feedback and heating from AGN are considered a prime candidate for solving the “cooling flow” problem in the hot gas of galaxy clusters. Recent observations using the Chandra telescope have produced detections of X-ray surface-brightness depressions known as “cavities” or “bubbles” in many of these systems, interpreted as buoyantly rising bubbles created by AGN outbursts. Studies of such cavities in clusters suggest that the outburst energy required to inflate them would be sufficient to balance cooling. The pressure from these cavities has a one-to-one correlation with the luminosity of the clusters. We created simulations of bubbles, varying their radii and distances from the AGN center, using different theoretical models and experimental data on X-ray cavities. We then looked at their pressure and the corresponding luminosity of the clusters and found a similar correlation, which implies that the correlation is a property of the cluster gas rather than of the cavities themselves. Thus, the objects that we observe as cavities might merely be noise in the telescope data, since the pressure-luminosity one-to-one correlation is not a cavity-specific property. We also observed that the scatter depends on the bubble geometry. From previous observations we also noticed a correlation between cavity radii and distance from the AGN center, and we have imposed theoretical constraints to explain this phenomenon as well.
Presented by
Hurum Maksora Tohfa
Institution
Bryn Mawr College, MIT Kavli Institute for Astrophysics and Space Research

Back to top

Cover vs Frank-Wolfe: Finding the Log-Optimal Portfolio

Mariam Alaverdian, Robert M. Freund

Abstract
The goal of any financial portfolio is to maximize the returns on the assets it contains. To maximize returns, it is necessary to find the right proportion of each asset within the portfolio. We can do so by optimizing a logarithmic objective function that represents the portfolio's return. Linear problems are relatively simple to solve, but algorithms for non-linear problems can take substantial time to find an optimal solution. It is therefore important to compare multiple algorithms and their variations to determine which is most efficient. Dr. Thomas M. Cover created an algorithm that handles the optimization of the logarithm of the portfolio return. His technique finds a log-optimal portfolio by maximizing the expected log return over the simplex of asset proportions that are non-negative and sum to 1. The Frank-Wolfe algorithm, on the other hand, considers a linear approximation of the objective function over the same domain and, in each iteration, moves toward the optimizer of that linear function. Here, I compare the performance of Cover's method to the Frank-Wolfe method, both with and without a binary-search line-search for step-size selection. My project used a data set containing 9 years of data on the assets within the portfolio, from which I calculated the fractional return per asset. To evaluate the performance of the two algorithms, I measured the time each method required to find the optimal solution of a given objective function. To avoid complications arising from the memory and processing speed of the computer or program itself, I also counted the number of iterations each algorithm took to reach the optimal solution. Thus far, I have successfully encoded the Frank-Wolfe algorithm with and without binary-search line-search, and Cover's algorithm without binary-search line-search.
Once the comparison is done, I will be able to draw a conclusion about which method is more efficient for the log-optimal portfolio problem. This framework can also be extended to work ranging from medical image reconstruction to matrix completion problems.
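The Frank-Wolfe iteration described above can be sketched in a few lines. This is a minimal illustration, not the project's code: the return scenarios, iteration count, and the standard open-loop step size 2/(k+2) (used here in place of the binary-search line-search) are assumptions.

```python
import numpy as np

def log_return(w, R):
    """Expected log return of portfolio weights w over gross-return
    scenarios R, an array of shape (T, n)."""
    return np.mean(np.log(R @ w))

def frank_wolfe_log_portfolio(R, iters=500):
    """Frank-Wolfe for the log-optimal portfolio on the simplex.

    Each iteration linearizes the objective at the current weights w
    and moves toward the simplex vertex that maximizes that linear
    approximation, using the open-loop step size 2/(k+2).
    """
    T, n = R.shape
    w = np.full(n, 1.0 / n)                # start at the uniform portfolio
    for k in range(iters):
        grad = R.T @ (1.0 / (R @ w)) / T   # gradient of the mean log return
        j = np.argmax(grad)                # best vertex e_j of the simplex
        gamma = 2.0 / (k + 2.0)            # open-loop step size
        w = (1.0 - gamma) * w              # convex combination keeps w
        w[j] += gamma                      # on the simplex automatically
    return w
```

Because every iterate is a convex combination of simplex vertices, the weights stay non-negative and sum to 1 without any projection step.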
Presented by
Mariam Alaverdian <amariam@mit.edu>
Institution
Department of Applied Mathematics, Yale University

The Failure of Government to Target Hispanics through Online COVID-19 Vaccination Campaigns

Valerie Aguilar Dellisanti, Catherine Tucker

Abstract
The Latino community has been disproportionately affected by the COVID-19 pandemic, with higher contraction and death rates than their non-Hispanic white counterparts. This study measures the role of social media in reaching Hispanics across the United States through government-sponsored immunization campaigns. We used data from Facebook's Ads Library, analyzing publicity by government agencies, emphasizing Spanish-to-English ad ratios and comparing them to the Hispanic and Spanish-speaking populations in different states. The time frame was February 1 to July 15 of this year. Our results show that, of all active state-sponsored vaccination Facebook ads, only 6.7% are in Spanish. Additionally, four of the five states with the largest Hispanic populations (California, Texas, Florida, New York, and Arizona) have concerning Spanish-to-English ad ratios. We hope that through this project we can raise awareness and highlight the disparities affecting the Hispanic population's immunization rates in the United States.
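The per-state Spanish-to-English ad ratio at the center of this analysis can be sketched as a simple count over ad records. The record layout here (fields "state" and "language") is hypothetical, standing in for whatever fields were pulled from Facebook's Ads Library.

```python
from collections import Counter

def spanish_to_english_ratio(ads):
    """Compute the Spanish-to-English ad ratio per state.

    `ads` is a list of dicts with hypothetical fields "state" and
    "language" ("es" for Spanish, "en" for English). Returns a dict
    mapping each state to its ratio; a state with no English ads
    maps to infinity.
    """
    counts = Counter((ad["state"], ad["language"]) for ad in ads)
    states = {ad["state"] for ad in ads}
    return {
        s: (counts[(s, "es")] / counts[(s, "en")]
            if counts[(s, "en")] else float("inf"))
        for s in states
    }
```

The resulting ratios can then be set against each state's Hispanic and Spanish-speaking population shares to surface the disparity the abstract describes.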
Presented by
Valerie Aguilar Dellisanti
Institution
Brown University