Collection
Undergraduate Theses, School of Engineering
Heart disease is the world’s leading cause of mortality, accounting for over 7.4 million deaths each year. Consequently, there has been immense interest in the medical and scientific community to develop technologies to help regenerate injured hearts. 3D printing of biological materials (i.e., 3D bioprinting) is an innovative technology that may be applied to help us understand and treat cardiac disease. In my thesis work, I utilized 3D bioprinting to create a viable and functional artificial cardiac construct. Human umbilical vein endothelial cells (HUVECs) and human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) were encapsulated in 10% gelatin methacrylate (GelMA) hydrogels and subsequently bioprinted into 3D tissue constructs. Analysis of the printed tissues over time demonstrated cell viability, as assessed by metabolic activity in culture. Cell migration and proliferation were observed in the printed HUVEC tissues 7 days post-printing, and beating was observed in the hiPSC-CM tissues 3 days post-printing. In addition, I demonstrated the endothelialization of acellular printed scaffolds using a perfusion bioreactor system. In the future, I plan to utilize the 3D tissue printing and bioreactor platform for in vitro applications such as personalized drug cardiotoxicity screening and in vivo applications such as generating a cardiac patch for transplantation into an animal model of myocardial infarction.
Collection
Undergraduate Theses, School of Engineering
The medial prefrontal cortex (mPFC) is important in guiding learned, reward-seeking behavior through the integration of relevant information from multiple brain regions. Multiple studies have used the head-fixed Go/No-Go task to assess mPFC function in decision-making because rodents learn the task quickly and it is compatible with several imaging strategies. However, this paradigm has two inherent flaws: 1) head-fixation does not emulate natural behavior, and 2) the motivational state of the animals is ambiguous because they are forced to perform task trials at an experimenter-defined schedule. To assess the function of mPFC in decision-making in a more naturalistic context, I have designed and successfully built a behavioral system that robustly reads out a two-alternative forced-choice (2AFC) behavior in freely moving mice. First, this behavioral system allows freely moving mice to perform learned choice behavior in a custom-designed behavioral box that is dimensioned to mimic their home cage. Second, mice are able to self-initiate task trials without external constraints placed upon them, which is made possible by motion-sensing hardware. This behavioral system is also built to be flexible, allowing for easy control of task parameters and integration of external devices such as imaging cameras. The construction of this system combines sensory systems, MATLAB software, and the Arduino platform. By using this behavioral setup, I have effectively trained mice in the 2AFC task, and have recently begun to perform in vivo Ca2+ imaging using the Inscopix mini-endoscopic microscopy system. By correlating the real-time activity patterns of genetically/anatomically-defined mPFC neurons with freely moving choice behavior, I have found cell-type-specific activity that correlates with specific outcomes of the 2AFC task.
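As an illustration of the trial-initiation logic described above, here is a minimal sketch in Python (a stand-in; the thesis itself uses MATLAB and the Arduino platform). The serial port name, baud rate, message strings, and timing parameters are all hypothetical:

```python
# Illustrative trial-gating loop (Python stand-in; the thesis uses MATLAB
# and Arduino). Port name, baud rate, message strings, and the inter-trial
# interval are all hypothetical.
import time
import serial  # pyserial

PORT, BAUD = "/dev/ttyACM0", 115200   # hypothetical Arduino connection
MIN_ITI_S = 3.0                       # hypothetical minimum inter-trial interval

def run_session(n_trials=100):
    ser = serial.Serial(PORT, BAUD, timeout=0.1)
    last_trial_end = 0.0
    completed = 0
    while completed < n_trials:
        msg = ser.readline().decode(errors="ignore").strip()
        # Assume the Arduino streams "MOTION" when its motion sensor
        # detects the mouse at the initiation port.
        if msg == "MOTION" and time.time() - last_trial_end > MIN_ITI_S:
            ser.write(b"TRIAL_START\n")   # tell the Arduino to present stimuli
            outcome = ser.readline().decode(errors="ignore").strip()
            print(f"trial {completed}: {outcome}")  # e.g. hypothetical "LEFT_CORRECT"
            last_trial_end = time.time()
            completed += 1
```

The key design point is the gating condition: a trial begins only when the motion sensor reports the mouse at the initiation port and the minimum inter-trial interval has elapsed, so trials are animal-initiated rather than experimenter-scheduled.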
Collection
Undergraduate Honors Theses, Graduate School of Education
Despite the emergence of formal “critical whiteness studies” in academia over the last twenty years, there remains a dearth of research on the teaching of whiteness studies. Additionally, there remains a lack of formal engagement with the topics of race and whiteness, especially by white students in universities across the United States. This paper investigates the potential for implementing anti-racist critical whiteness studies at Stanford University, seeking to understand how the field of whiteness studies is taught on college campuses and what Stanford students think about studying race and whiteness. The paper analyzes results from a survey of 200 Stanford students that aimed to collect student reactions to studying race and whiteness, as well as students’ impressions of their own qualification and comfort in discussing race with friends and family. Results indicate a strong desire among the Stanford student body to study race, and provide an argument for Stanford University to establish critical whiteness studies within a framework of anti-racism education. The paper proceeds to analyze eighteen syllabi for classes on whiteness, concluding that while there are many ways to teach about whiteness, there is a distinct and common approach that uses high-level concepts, such as race as a social construction with intersectional implications and a universalized understanding of whiteness, to make whiteness visible to students in the class. Significant tensions remain around analyzing racism without centering or privileging whiteness. However, taken together, Stanford students’ endorsement of race and whiteness studies and the sample syllabi’s formal structure for teaching anti-racist critical whiteness studies provide a case for Stanford University to implement such a curriculum.
Collection
Center for International Security and Cooperation (CISAC) Interschool Honors Program in International Security Studies
Existing literature has attempted to define the Shanghai Cooperation Organization and determine its role in both Chinese and Russian foreign policy. Unfortunately, misunderstandings of the SCO as a security institution mean that efforts to situate the SCO within foreign policies are mistaken. Some observers have described the SCO as an alliance against the United States and other external forces. Others consider the SCO to be a security management institution balancing Chinese and Russian interests in Central Asia. In reality, the SCO does not fit neatly into either category. Over the course of its history, first as the Shanghai Five mechanism and then as the Shanghai Cooperation Organization, the developing institution has attempted to perform the functions of both an alliance and a security management institution. However, the lack of institutionalization, whether purposeful or due to competition between member states, has prevented the SCO from developing into either. At the same time, Russia’s obstruction of Chinese interests through the SCO has led China to pursue the Belt and Road Initiative as an unconstrained exercise of its power. This raises the possibility of an increasingly competitive, zero-sum “New Great Game” over Central Asia between China and Russia. The relative power disparity between China and Russia puts this competition in Beijing’s favor, but the exercise of power unrestricted by legitimate institutions may well provoke opposition from regional partners. A stable regional order that preserves China’s place in the sun will ultimately require Beijing to accept limited constraints and create legitimate institutions.
Collection
Masters Theses in Journalism, Department of Communication, Stanford University
A story about how arts in rehabilitative programs can change the lives of juveniles.
Collection
Undergraduate Honors Theses, Graduate School of Education
This paper examines the impact of an elementary charter school that targets low-income, black and Hispanic boys in Florida. Using student-level demographic data, charter school students were matched with traditional public school students who shared the same demographic characteristics and had similar baseline third-grade test scores. The results showed the following impact on the difference in scores from third to fourth grade: for math, attending the charter school is associated with scoring 0.72 standard deviations lower than matched traditional public school students; for reading, with scoring 0.19 standard deviations lower. Both of these results are statistically significant. This model can be used in different scenarios in order to understand where certain demographics of students are succeeding, and can thus lead to further causal research to identify and disseminate successful tactics and pedagogies throughout the school system.
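Read as a matched gain-score comparison, the reported effects correspond to an estimator of roughly the following form (notation mine, not the paper's):

```latex
\[
\hat{\tau} \;=\; \frac{1}{N}\sum_{i=1}^{N}
\frac{\bigl(y^{C}_{i,4}-y^{C}_{i,3}\bigr)-\bigl(y^{P}_{i,4}-y^{P}_{i,3}\bigr)}{\sigma},
\]
```

where \(y^{C}\) and \(y^{P}\) are the charter and matched public-school test scores in grades 3 and 4 and \(\sigma\) is the test's standard deviation; the reported estimates are \(\hat{\tau}\approx -0.72\) for math and \(-0.19\) for reading.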
Book
1 online resource.
Metasurface optics have been developed for a wide variety of optical applications that compete with, as well as go beyond, the functionality of conventional optics. One key challenge for future metasurfaces is the development of individually addressable optical properties that can be actively changed with controllable input stimuli. In this work, we will explore how electro-mechanics can be used to couple a mechanical degree of freedom to an optical response. We will see how the localized optical resonances within a silicon nanowire placed near a mirror can be leveraged as a building block for engineering an active metasurface. Tuning the amplitude of scattered light enables color-tunable active metasurfaces. By designing a structure with a modifiable phase gradient, both light steering and focusing can be controlled by an applied electrical signal. In addition to covering mechanically controlled active metasurfaces, we will also show that it is possible to use passive metasurfaces to gain real-time three-dimensional information in a volume and that, in certain cases, active metasurface optics are not the only solution to improve optical functionality.
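The light steering described here relies on the standard generalized law of refraction for phase-gradient metasurfaces; in conventional notation (not taken from this work):

```latex
\[
n_t \sin\theta_t - n_i \sin\theta_i \;=\; \frac{\lambda_0}{2\pi}\,\frac{d\Phi}{dx},
\]
```

so electrically reconfiguring the phase profile \(\Phi(x)\), as the electro-mechanical resonators allow, changes the refraction angle \(\theta_t\) and hence steers or refocuses the beam.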
Book
1 online resource.
Construction field managers often struggle to keep projects on schedule, resulting in time and cost overruns. Schedule conformance depends on activities starting and finishing on time. However, activities are often delayed because the flows necessary to start their execution are unavailable. These flows can be classified into seven types: labor, equipment, workspace, materials, precedence, information, and external flows (Koskela 1999). I tracked a total of 5,843 flows in this research, all of which fell into one of these seven categories. Flows released from upstream activities become inputs to downstream activities. Therefore, delays in upstream activities hinder the timely release of flows, which can cause delays in downstream activities depending on those flows. To manage the flows, field managers need to know the flows' source, their status, and their readiness likelihood. Current construction models do not formally represent, measure, and track all the flow types. Hence, field managers lack formal methods for tracking the flows' status and estimating their readiness likelihood. Instead, they rely on their intuition and experience to manage the flows. This dissertation presents an activity- and flow-based construction model, called the Activity-Flow Model (AFM). The AFM enables field managers to proactively manage on-site work by formally representing, measuring, and tracking the construction activities and flows. The AFM consists of an ontology that defines the representation of the activities, the flows, and their interactions; the planning and control methods that enable the AFM's implementation on site; and the predictive models that help anticipate variations in downstream activities. The AFM was developed based on literature, field observations, and feedback from field managers. The AFM was validated prospectively for a total of 26 weeks through its implementation on three building projects that were in different phases (foundations, core and shell, and finishing) and locations (Bogotá, Copenhagen, and Lima), and that used different planning and control methods (master schedule and weekly planning, Last Planner System, and Location-based Management System). The AFM was able to represent all the activities (1,645) and flows (5,843) in the test projects, track their variations, and quantify their variability. The planning and control methods enabled field managers to proactively manage the projects, taking both the activities and flows into account. The predictive models supported by the AFM allowed field managers to anticipate variations in downstream activities and outperformed the predictive models supported by the Resource-constrained Critical Path Method (RCPM) (Fondahl 1961) and Location-based Management System (LBMS) (Kenley and Seppänen 2009) representations. The field managers used the analytics of the activities' and flows' performance record to allocate resources, size buffers, and modify the look-ahead schedule. Hence, the AFM can help field managers improve flow readiness, reduce activity delays, and improve schedule conformance.
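To make the ontology concrete, the sketch below shows one way the activity-flow representation could be encoded; this is an illustration under my own assumptions, not the dissertation's implementation:

```python
# Illustrative encoding of the Activity-Flow Model ontology (assumptions
# mine; not the dissertation's implementation). Activities consume input
# flows released by upstream activities and release output flows downstream.
from dataclasses import dataclass, field
from enum import Enum

class FlowType(Enum):
    LABOR = "labor"
    EQUIPMENT = "equipment"
    WORKSPACE = "workspace"
    MATERIALS = "materials"
    PRECEDENCE = "precedence"
    INFORMATION = "information"
    EXTERNAL = "external"

@dataclass
class Flow:
    flow_type: FlowType
    source: str                        # upstream activity releasing this flow
    ready: bool = False
    readiness_likelihood: float = 0.0  # estimated from the performance record

@dataclass
class Activity:
    name: str
    inputs: list = field(default_factory=list)    # flows required to start
    outputs: list = field(default_factory=list)   # flows released on finish

    def can_start(self) -> bool:
        # An activity can start only when every required input flow is ready;
        # a missing flow of any of the seven types delays the activity.
        return all(f.ready for f in self.inputs)
```

This mirrors the start condition the AFM tracks: an activity is delayed whenever any of its seven categories of input flows is unavailable, which is what makes flow readiness the quantity worth predicting.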
Book
1 online resource.
Additive manufacturing (AM) is the automated layer-wise fabrication of 3D objects directly from geometrical computer models. In the music technology lab, AM affords the rapid testing of enclosures and components for both sound capturing and sound producing instruments. These instruments, typically outsourced to manufacturing hubs, are now produced near the site of musical experimentation through desktop fabrication. Parametric modeling and rapid fabrication with AM accelerate the design cycle for the production of instruments for music research. AM alters the site of manufacturing, the duration between the digital sketch and its materialization, and the material constitution of the instruments we might deploy; these shifts have consequences for what gets made and in turn alter artistic practices that integrate such tool-making tools. To illustrate these affordances of AM, I describe a suite of instruments within a taxonomy of usage categories that move from reproduction of known instruments, to the augmentation of found ones and finally a phase of invention. Invention arises not simply from putting such machines in the service of novel ideas but rather from carefully examining the material outcomes of the layering process itself. The term AM is used in place of numerous alternatives to emphasize this anisotropic grain of the printed object. Although the additive method announces a powerful flexibility in the shapes it sheds, its performance in various acoustic and electroacoustic scenarios depends in part on this grain. The practice, therefore, navigates trade-offs between malleable fabrication methods for organizing material and the quality of the resulting forms for organizing sound.
Book
1 online resource.
Ongoing advances in DNA sequencing technologies have undoubtedly shifted genomics into the realm of big data. To cope with this data explosion and enable rapid advances in biology and medicine, we must develop scalable and efficient methods for genomic data analysis that can leverage both domain-specific knowledge and the latest computing platforms, such as multi-core and accelerator-based architectures and cloud computing. This thesis presents several algorithms developed with this goal in mind, focusing on key tasks performed in resequencing studies, metagenomics, and cancer genomics. First, we consider the ubiquitous and compute-intensive task of read mapping. We propose the read mapping algorithm BALAUR, which can outsource a significant portion of the computation to the public cloud while preserving the privacy of genomic data. BALAUR uses the MinHash technique and a coarse-grained voting scheme to map the reads, relying on the high degree of similarity between the reads and their best reference matches. BALAUR has a runtime similar to state-of-the-art read mappers in short read mapping and achieves significant speedups over existing approaches in long read mapping. We also propose the read mapping algorithm BWBBLE, which maps reads to a large collection of genomes with the goal of eliminating reference bias and boosting mapping accuracy. BWBBLE creates a compressed linear representation of the collection, taking advantage of the high redundancy across the genomes. BWBBLE then uses this representation, along with an adaptation of the Burrows-Wheeler Transform search algorithm, to efficiently map the reads (since all the reads can be treated independently, this is an embarrassingly parallel task). This results in a significant reduction in space and runtime requirements compared to mapping to each genome individually. Next, we consider several resource-intensive tasks in metagenomics. We introduce the framework GATTACA for fast and accurate metagenomic binning. GATTACA uses a lightweight approach for estimating co-abundance profiles across a cohort of metagenomic samples (subsequently used as features for binning), replacing read mapping with a k-mer counting procedure. This approach can result in up to several orders of magnitude speedup in abundance estimation; it has also been parallelized for shared memory systems. By creating compact indices of metagenomic samples, GATTACA enables easy reuse across studies via fast downloads and small disk footprints. GATTACA also incorporates a fast procedure for metagenomic sample comparison based on MinHash that can be used for cohort selection. Finally, we focus on cancer genomics, specifically the critical task of inferring tumor heterogeneity and evolution. We introduce a novel algorithm, LICHeE, that uses variant frequencies from deep sequencing data across multiple tumor samples to infer the underlying cancer lineage tree and decompose the samples into subclonal populations. LICHeE is fast and scales efficiently with the number of input samples and variants, relying on a combinatorial formulation of the task as a search for spanning trees in a constraint network.
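Since both BALAUR and GATTACA lean on MinHash, a minimal Python sketch of the technique may help; the hashing scheme, k-mer length, and signature size here are illustrative choices of mine, not those of the thesis:

```python
# Minimal MinHash sketch in Python (illustrative only; BALAUR's actual
# hashing, k, and signature length differ). Similar sequences share many
# per-hash minima, so signatures estimate k-mer-set Jaccard similarity.
import hashlib

def kmers(seq, k=16):
    # Assumes len(seq) >= k.
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def minhash_signature(seq, k=16, n_hashes=64):
    sig = []
    for i in range(n_hashes):
        salt = i.to_bytes(8, "little")  # one salted hash per signature slot
        sig.append(min(
            int.from_bytes(
                hashlib.blake2b(km.encode(), salt=salt, digest_size=8).digest(),
                "little")
            for km in kmers(seq, k)))
    return sig

def estimated_jaccard(sig_a, sig_b):
    # The fraction of matching minima estimates the Jaccard similarity
    # of the underlying k-mer sets.
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

Because signatures are short and compared element-wise, similarity can be estimated without exchanging the full sequences, which is the kind of property that privacy-preserving outsourcing and compact cross-study indices build on.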
Book
1 online resource.
Biofilms are multicellular communities consisting of microorganisms enmeshed in an extracellular matrix of biopolymers. The matrix provides the community structure and cohesiveness and allows it to adhere to a variety of interfaces. Formation of a biofilm is advantageous to the microbial community, as it provides protection from external assaults (desiccation, oxidizing agents, predation) and from host immune defenses, facilitates close cell-to-cell interactions for DNA exchange, and creates nutrient gradients that give rise to metabolic diversity within the community. These factors allow biofilms to persist in a variety of settings, ranging from large-scale industrial equipment to medical implants in the human host. In fact, many infections are now recognized as biofilm-related and are difficult to treat by traditional means such as antibiotics. To combat unwanted biofilms, a current strategy is to take a biophysical approach and interfere with the biofilm structure by disrupting the extracellular matrix. This strategy could negate the survival advantages provided to the microorganisms by existing in the biofilm community. It also avoids the life-or-death pressure placed on microorganisms by traditional antibiotic treatment that gives rise to drug-resistant mutations. However, to achieve this goal of targeting the extracellular matrix, we require an improved understanding of the underlying mechanical properties of the biofilm structure. In this work, we describe the use of modified rheological methods to quantify mechanical interactions relevant at all stages of the biofilm lifecycle, including initial microbial adhesion to interfaces, maturation of the biofilm structure, and microbial dispersal. A Live Cell Monolayer Rheometer (LCMR) was used to study adhesion of uropathogenic Escherichia coli to bladder epithelial cells, the initial step in bladder infection. Quantitative mechanical measurements defined the contributions of bacterially produced type 1 pili, curli, and cellulose to bladder cell adhesion, and revealed an important role for cellulose in mediating these interactions. This novel use of live cell rheology can be expanded to study a variety of other relevant host-pathogen interactions. In a separate study, interfacial shear rheology was used to study the maturation of biofilms formed at the air-liquid interface by Vibrio cholerae, the causative agent of cholera. It was discovered that, out of several known extracellular matrix components in the V. cholerae biofilm, a specific matrix protein called Bap1 contributed significantly to maintenance of biofilm elasticity, biofilm hydrophobicity, and development of a mature biofilm structure. Finally, mechanical measurements relevant to biofilm dispersal were performed using a custom-built device to apply large deformations to Bacillus subtilis biofilms formed at the air-liquid interface. These measurements revealed that biofilms exhibit non-uniform deformation due to inhomogeneous mechanical properties within the structure and can have both viscoelastic and viscoplastic characteristics. Together, these studies produced new tools in the field of biofilm mechanics and provided quantitative measurements of mechanical interactions relevant to all stages of the biofilm lifecycle.
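For reference, the oscillatory shear measurements mentioned above are conventionally summarized by the small-amplitude moduli; in standard rheology notation (not specific to this work), for an imposed strain \(\gamma(t)=\gamma_0\sin(\omega t)\):

```latex
\[
\sigma(t) \;=\; \gamma_0\bigl[\,G'(\omega)\sin(\omega t) + G''(\omega)\cos(\omega t)\,\bigr],
\qquad
\tan\delta \;=\; \frac{G''(\omega)}{G'(\omega)},
\]
```

where \(G'\) is the storage (elastic) modulus and \(G''\) the loss (viscous) modulus; statements about Bap1 maintaining biofilm elasticity can be read as statements about its effect on \(G'\).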
Book
1 online resource.
We prove a version of Artin's criteria for representability of moduli functors in the setting of non-archimedean analytic geometry in characteristic zero, and deduce representability of the Picard functor under reasonable hypotheses.
Book
1 online resource.
In this work, we introduce and apply several new techniques for oil/gas reservoir optimization under uncertainty. As the first contribution, we develop a general methodology for optimal closed-loop field development (CLFD) under geological uncertainty. CLFD involves three major steps: optimizing the field development plan based on current geological knowledge, drilling new wells and collecting hard (well) data and production data, and updating multiple geological models based on all of the available data. In the optimization step, the number, type, locations and controls for new wells (and future controls for existing wells) are optimized using a hybrid Particle Swarm Optimization and Mesh Adaptive Direct Search (PSO-MADS) algorithm. The objective in the examples presented is to maximize expected (over multiple realizations) net present value (NPV) of the overall project. History matching is accomplished using an adjoint-gradient-based randomized maximum likelihood (RML) procedure. Different treatments are presented for history matching Gaussian and channelized models. Because the CLFD history matching component is fast relative to the optimization component, we generate a relatively large number of history matched models. Optimization is then performed using a representative subset of these realizations. We introduce a systematic optimization with sample validation (OSV) procedure, in which the number of realizations used for optimization is increased if a validation criterion is not satisfied. The CLFD methodology is applied to two- and three-dimensional example cases. Results show that the use of CLFD increases the NPV for the 'true' (synthetic) model by 10% to 70% relative to that achieved by optimizing over a large number of prior realizations. The CLFD framework includes several components, and different approaches for history matching, optimization, model selection and economic evaluation can be applied. In our second contribution, we address the problem of selecting a subset of representative geological realizations from a large set. Towards this goal, we introduce a general framework, based on clustering, for selecting a representative subset of realizations for use in simulations involving 'new' sets of decision parameters. Prior to clustering, each realization is represented by a low-dimensional feature vector that contains a combination of permeability-based and flow-based quantities. Calculation of flow-based features requires the specification of a (base) flow problem and simulation over the full set of realizations. Permeability information is captured concisely through use of principal component analysis. By computing the difference between the flow response for the subset and the full set, we quantify the performance of various realization-selection methods. The impact of different weightings for flow and permeability information in the cluster-based selection procedure is assessed for a range of examples involving different types of decision parameters. These decision parameters are generated either randomly, in a manner that is consistent with the solutions proposed in global stochastic optimization procedures such as GA and PSO, or through perturbation around a base case, consistent with the solutions considered in pattern search optimization.
We find that flow-based clustering is preferable for problems involving new well settings (e.g., time-varying well bottom-hole pressures) or small changes in well configuration, while both permeability-based and flow-based clustering provide similar results for (new) random multiwell configurations. We also investigate the use of efficient tracer-type simulations for obtaining flow-based features, and demonstrate that this treatment performs nearly as well as full-physics simulations for the cases considered. The various procedures are applied to select realizations for use in production optimization under uncertainty, which greatly accelerates the optimization computations. Optimization performance is shown to be consistent with the realization-selection results for cases involving new decision parameters. In the third contribution, we introduce a methodology for the joint optimization of economic project life and well controls. We present a nested formulation for this joint optimization problem in which we maximize NPV, subject to the constraint that the rate of return of operations is greater than the minimum attractive rate of return (MARR), or hurdle rate. The methodology provides the optimal project life and the optimal well controls such that the maximum NPV is obtained at the end of the project life and the rate of return of the project is essentially equal to the MARR. Application of this procedure avoids situations where NPV increases slowly over time but the benefit relative to the capital employed is extremely low. We demonstrate the successful application of this treatment for production optimization on two- and three-dimensional reservoir models.
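As a concrete illustration of the cluster-based realization selection, here is a minimal Python sketch; the feature dimensions, the 50/50 weighting, and the closest-to-centroid rule are placeholders of mine, not the dissertation's exact procedure:

```python
# Illustrative cluster-based selection of representative realizations
# (assumptions mine: feature sizes, weighting, and the nearest-to-centroid
# rule are placeholders, not the dissertation's exact procedure).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def standardize(A):
    return (A - A.mean(axis=0)) / (A.std(axis=0) + 1e-12)

def select_representatives(perm_fields, flow_features, n_select=10, w=0.5):
    """perm_fields: (n_real, n_cells) log-permeability maps;
    flow_features: (n_real, n_flow) responses from a base flow simulation."""
    perm_feats = PCA(n_components=5).fit_transform(perm_fields)  # compress permeability
    X = np.hstack([w * standardize(perm_feats),
                   (1 - w) * standardize(flow_features)])
    km = KMeans(n_clusters=n_select, n_init=10).fit(X)
    reps = []
    for c in range(n_select):
        members = np.where(km.labels_ == c)[0]
        dist = np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1)
        reps.append(int(members[np.argmin(dist)]))  # realization nearest the centroid
    return reps
```

Varying the weight w between the permeability-based and flow-based blocks corresponds to the weighting study described above; subsequent optimization then runs only over the selected realizations.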
Book
1 online resource.
Over 1.5 million knee MRI scans are performed in the US annually, for reasons ranging from acute knee injuries to researching diseases such as osteoarthritis. MRI offers high-resolution imaging with excellent soft-tissue contrast and is often a tool of choice for interrogating various pathologies in the musculoskeletal system. More recently, there has been a large emphasis on utilizing MRI to generate imaging-based biomarkers to track spatial and longitudinal changes in tissues. Such biomarkers have potential for understanding and describing the pathophysiology of complex diseases such as osteoarthritis. Despite the excellent image quality and potential biomarkers of disease activity that MRI can generate, it is still challenging to acquire all these features in rapid MRI protocols. High-resolution sequences are usually needed to evaluate the fine structures typical of the musculoskeletal system, and high signal from tissues is usually needed to accurately characterize quantitative biomarkers. At the same time, there is a growing need for rapid imaging methods to maximize patient throughput while minimizing patient discomfort. In MRI, however, signal, resolution, and imaging time are challenging to optimize simultaneously. In this work, I will describe how the already-available double-echo steady-state (DESS) sequence was optimized to generate quantitative morphological and biochemical biomarkers for cartilage and meniscus in only a 5-minute acquisition. Conventional imaging methods used in large clinical studies typically require around 25 minutes of imaging time to generate such biomarkers. The validity of these biomarkers was assessed against established, time-consuming methods, and their reliability was evaluated using repeated acquisitions. Both the accuracy and the precision of this rapid DESS method were high enough for it to be confidently used in clinical studies to track small longitudinal changes. This thesis also describes how the same DESS sequence can be used for routine clinical knee MRI. The contrasts and the resolution that the DESS sequence offers can be used to diagnose internal knee derangement and its corresponding signs in tissues such as the cartilage, meniscus, tendons, ligaments, bone, and synovium. In addition to the morphological image contrasts, this study also probed the utility of having automatic quantitative T2 measurements available during diagnostic review. The accuracy of the 5-minute DESS method was compared against that of the routine knee MRI protocol, with very promising results for DESS. This thesis also studied the diagnostic accuracy of pairing one sequence from the conventional imaging protocol with DESS, in order to create an abridged two-sequence protocol. Such protocols have great potential for transitioning from 30+ minute clinical protocols to protocols that last 5-7 minutes. While the 5-minute DESS sequence was able to interrogate several musculoskeletal tissues, there are still several tissues, known as short-T2 tissues, that generate minimal signal with conventional Cartesian MRI sequences. To image such tissues and quantify some of their underlying properties, we developed the Ultrashort Echo Time DESS (UTEDESS) sequence. This method permits imaging with a very high signal-to-noise ratio.
This high signal with UTEDESS can be used to perform morphological and quantitative imaging of the menisci, tendons, and ligaments, all tissues that are very challenging to image with routine sequences. UTEDESS also provides the ability to image at isotropic resolutions in short scan times, so that the images can be retrospectively resampled in arbitrary planes to maximize the diagnostic efficiency of the method. Overall, the advances described in this thesis have the potential to accelerate the current paradigm of musculoskeletal imaging while generating additional data that can be used in both diagnostic and research settings.
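For orientation, one commonly used first-order estimate of T2 from the two DESS echoes, neglecting T1 and flip-angle effects (the thesis's actual estimation may well differ), is:

```latex
\[
\frac{S_2}{S_1} \;\approx\; e^{-2(TR - TE)/T_2}
\quad\Longrightarrow\quad
T_2 \;\approx\; \frac{-2\,(TR - TE)}{\ln\!\left(S_2/S_1\right)},
\]
```

where \(S_1\) and \(S_2\) are the two echo amplitudes and TR and TE the repetition and echo times.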
Book
1 online resource.
Mapping the complex structural connectivity of the human brain in vivo is essential for understanding healthy brain function and the fundamental basis of many neurological and psychiatric disorders. Diffusion-weighted magnetic resonance imaging (MRI) measures the diffusion pattern of water molecules to infer the underlying tissue microstructure. Coupled with fiber tracking techniques, diffusion-weighted MRI has become a widely utilized non-invasive method for mapping neuronal fiber pathways. Nonetheless, it is challenging to accurately model the diffusion pattern in tissue, while a model-free approach requires a lengthy acquisition. Further, the reconstructed fiber model requires rigorous validation for clinical translation. This dissertation addresses these challenges in the course of three projects. First, a thorough analysis of the effects of q-space truncation and sampling on the water molecule displacement ensemble average propagator (EAP) in the model-free q-space imaging (QSI) framework is performed. This study clarifies guidelines for acquiring and reconstructing Cartesian QSI data such that aliasing is prevented in the EAP and Gibbs ringing is minimized in the estimated fiber orientations. To increase QSI's applicability to different types of data, an intuitive and practical QSI reconstruction framework for obtaining the EAP and fiber orientations from multi-shell q-space samples is proposed. Finally, a retrospective study is conducted to assess the validity and efficacy of diffusion-weighted MRI fiber tracking-based targeting for transcranial MRI-guided focused ultrasound treatment of essential tremor. The studies presented in this dissertation advance neuronal fiber mapping approaches for diffusion-weighted MRI.
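The q-space analysis rests on the standard Fourier relationship between the normalized diffusion signal and the EAP, which is why q-space truncation and finite sampling manifest as Gibbs ringing and aliasing in the propagator (sign convention varies by text):

```latex
\[
P(\mathbf{r}) \;=\; \int E(\mathbf{q})\, e^{-i 2\pi\, \mathbf{q}\cdot\mathbf{r}}\, d\mathbf{q},
\]
```

where \(E(\mathbf{q})\) is the normalized signal attenuation and \(P(\mathbf{r})\) the ensemble average propagator.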
Book
1 online resource.
This dissertation studies operational and policy-making problems that arise in agricultural supply chains. In the first two chapters, we explore the role of government interventions in poverty alleviation and in the promotion of sustainable production, respectively. In the third chapter, we investigate farmers' land allocation decisions in the face of a diversification option. Chapter 1 is motivated by policymakers' efforts to alleviate poverty and maintain food security by supporting poor farmers in developing countries. We investigate the effectiveness of three types of interventions (price support, cost support, and yield enhancement efforts), as well as different policy implementation methods such as announcing the total budget or the unit support, in terms of their impact on farmers' incomes, consumer surplus, and return on government spending. We show that price and cost support interventions are equivalent if the total budget is public information. On the other hand, if the government announces the unit support, the benefit to different stakeholders along the agricultural supply chain depends on the market distortion created by the intervention. Specifically, in this case, price support results in greater distortion, benefiting consumers more than cost support, whereas the converse holds for farmers. Furthermore, we find that under yield-enhancing efforts, farmers may incur losses due to the interplay of several market and crop characteristics. Lastly, we show that interventions cannot always generate a positive return from the government's perspective. Chapter 2 explores the role of policy instruments in promoting sustainable practices in agricultural production. We investigate the effectiveness of a number of policy instruments, namely taxes and subsidies, in terms of their impact on the adoption of sustainable practices, producers' profits, consumer surplus, and return on government spending. Our findings indicate that while using only taxes encourages the adoption of sustainable production, social welfare decreases as a result. Utilizing only subsidies outperforms policies that involve both taxes and subsidies in achieving higher social welfare, but the converse is true in achieving a higher adoption rate. We show that zero-expenditure policies result in a decline in social welfare unless producers face financial constraints in making the costly transition to sustainable practices. Finally, we conduct a numerical study using data on conventional and organic egg production in Denmark and make policy recommendations in order to achieve the target adoption rate set by the Danish government. Chapter 3 investigates farmers' production decisions when facing different crop options. We investigate the value of crop diversification for farmers and the impact of this flexibility on the supply chain. Our results indicate that farmers' land allocation decisions depend on the total farm space (capacity) available and that diversification may not be the equilibrium outcome if capacity is low. We find that as the profitability gap between the alternative crops increases, monoculture outweighs diversification for higher capacity values. When diversification is the equilibrium outcome, yield variability of both crops adversely affects farmers and the supply chain. On the other hand, buyers are better off when the alternative crop has high yield variability.
In fact, buyers may benefit from an increase in the yield variability of the crop they buy if farmers are incentivized to limit the production of the alternative crop in order to extract the maximum revenue from the market.
Book
1 online resource.
Lung disease is the third leading cause of death in the United States and has an even higher fatality rate in countries with excessive pollution. Strikingly, pulmonary mechanics and airway obstruction remain drastically understudied. The airway is a living system, and its disease-driven adaptation induces remodeling of its geometry and material properties, resulting in airway occlusion. Utilizing computational simulations and experimental characterization of airway mechanical properties, this thesis seeks to confront clinically relevant questions pertaining to airway collapse in diseases such as asthma and bronchitis. The computational results - based on the theory of finite growth, solid mechanics, and nonlinear finite element analysis - rationalize medical observations and elucidate the complex phenomenon of airway obstruction. The complementary porcine tissue experiments address the pressing need for airway-specific material characterization to inform the biophysical response of the small bronchi, the predominant site of obstruction. This research will focus on highlighting the tightly connected, iterative computational-experimental nature of airway mechanics research to enable translational discoveries in the clinic through predictive modeling, advanced medical diagnostics, and optimized interventions in pulmonary healthcare.
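The theory of finite growth invoked here conventionally rests on a multiplicative split of the deformation gradient; in standard notation (not quoted from the thesis):

```latex
\[
\mathbf{F} \;=\; \mathbf{F}^{e}\,\mathbf{F}^{g},
\]
```

where \(\mathbf{F}^{g}\) describes stress-free, disease-driven growth of the airway wall and \(\mathbf{F}^{e}\) the elastic deformation that restores compatibility; the residual stresses generated by this mismatch can drive mucosal folding and, ultimately, airway occlusion in such models.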
Book
1 online resource.
The reentry blackout phenomenon affects most spacecraft entering a dense planetary atmosphere from space, due to a plasma layer that surrounds the spacecraft. This plasma layer is created by the ionization of ambient air due to shock and frictional heating generated by the moving reentry vehicle and, in some cases, is further enhanced by contamination from ablation products. The highly mobile electrons in the plasma cause strong attenuation of incoming and outgoing electromagnetic waves, including those used for command and control, communication, and telemetry, over a period referred to as the “blackout period”. The blackout period may last up to several minutes and, at reentry speeds that may be of the order of 10 km/s, poses a serious safety hazard for the payload on board the spacecraft, especially for human spaceflight. In this work, we present a method for alleviation of reentry blackout using electric fields applied in a pulsed fashion. We study the reentry plasma’s interaction with electronegative voltage pulses using computer simulations that incorporate models of the plasma’s response to the applied electric field and interactions between the plasma sheath and the spacecraft surface. The simulations show how one can create pockets of depleted electron density in the reentry plasma sheath that may be used as “communication windows”, thereby circumventing reentry blackout. Several parametric sweeps are also performed in order to design a blackout alleviation system. Finally, we present a discussion of experimental efforts to verify the simulation results and conclude with a conceptual design for a reentry communications blackout alleviation system based on the exclusive use of electric fields.
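The blackout mechanism can be made quantitative with the standard cold-plasma cutoff: an electromagnetic wave propagates through the sheath only above the plasma frequency,

```latex
\[
\omega_{p} \;=\; \sqrt{\frac{n_e\, e^{2}}{\varepsilon_0\, m_e}},
\]
```

where \(n_e\) is the electron number density. Reentry sheaths reach densities for which \(\omega_p\) exceeds typical telemetry carrier frequencies; locally depleting \(n_e\) with the pulsed electric fields lowers \(\omega_p\) below the carrier frequency, which is what opens the “communication windows” described above.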
Collection
Undergraduate Theses, Department of Biology, 2016-2017
Illnesses that elicit an innate maternal immune activation (MIA) during pregnancy can expose the fetal brain to proinflammatory cues that diffuse across the placenta. Fetal exposure to MIA has been shown to disrupt normal neurodevelopment and is linked to the progression of a variety of neurodevelopmental disorders, including autism, schizophrenia, cerebral palsy, and mental retardation. However, the specific neurological abnormalities through which MIA alters neurodevelopment have yet to be identified. Some studies have suggested that MIA can induce abnormalities in the abundance and distribution of neuronal subtypes in the cortex, while others demonstrate systematic brain atrophy across many brain regions. These cellular changes could contribute to overall circuit dysfunction, perhaps contributing to the pathogenesis of disease. My interests are in the specific relationship between MIA and schizophrenia. MIA-induced changes in neuronal subtypes have been documented in the cerebral cortex, but there is no information available on the effects of MIA on neuronal subtypes within the striatum, an additional region thought to be critical in the etiology of schizophrenia. This study specifically investigates MIA’s effect on the abundance of a particular type of inhibitory interneuron implicated in schizophrenia: parvalbumin-positive interneurons (PVIs), which play an important role in modulating striatal output by delivering inhibition to striatal medium spiny projection neurons (MSNs). As the selective activation of MSNs is thought to be critical to behavior selection and cognition, deficits in striatal PVIs could contribute to abnormal cognitive processing loops initiated in the striatum, perhaps contributing to the dysregulation of behaviors implicated in schizophrenia. My study indicates that MIA does indeed result in a reduction in the density of PVIs in the striatum, perhaps analogous to the lowered PVI density observed in the cortex of schizophrenics. This suggests a role for MIA in striatal patterning that could contribute to dysregulation or pathology by altering the cellular composition of schizophrenia-related neural networks.
Book
1 online resource.
A very large number of practical optimization problems can be expressed as the minimization of a convex objective function over a nonconvex set, specifically discrete sets such as the set of integer points. These problems arise in a variety of fields such as statistics and modeling, data analysis, and control. Two general approaches have been used for solving mixed-integer optimization problems. One approach is to solve the problem globally using methods such as branch-and-bound, which suffer from exponential worst-case time complexity. Another approach is to use heuristics, such as semidefinite relaxation hierarchies, which terminate in polynomial time but do not guarantee finding the global solution. In this dissertation, we discuss a generic system of heuristics for such problems. We describe an implementation of these methods in a package called NCVX, an extension of CVXPY, a Python package for formulating and solving convex optimization problems. Our heuristics, which employ convex relaxations, convex restrictions, local neighbor search methods, and the alternating direction method of multipliers (ADMM), require the solution of a modest number of convex problems and are meant to apply to general problems without much tuning. Several numerical examples, such as regressor selection, circle packing, the traveling salesman problem, and factor analysis modeling, are studied. We also describe a fast optimization algorithm for approximately minimizing convex quadratic functions over the intersection of affine and separable constraints. We discuss the favorable computational aspects of our algorithm, which allow it to run quickly even on very modest computational platforms such as embedded processors. We study numerical examples including hybrid vehicle control and power converter control, and we compare our methods with existing open-source and commercial alternatives. Finally, we discuss how randomized linear programming heuristics can be used to solve graph isomorphism problems. We motivate our heuristics by showing guarantees under some conditions, and present numerical experiments that show the effectiveness of these heuristics in the general case.
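To give a flavor of these relaxation-based heuristics, here is a minimal relax-and-round example written in plain CVXPY; this is an illustration, not NCVX's actual interface, and the problem data are random:

```python
# Relax-and-round heuristic for Boolean least squares, a toy instance of
# the nonconvex problems NCVX targets (plain CVXPY, not NCVX's API).
import cvxpy as cp
import numpy as np

np.random.seed(0)
m, n = 30, 10
A = np.random.randn(m, n)
b = A @ np.random.randint(0, 2, n) + 0.1 * np.random.randn(m)

x = cp.Variable(n)
# Convex relaxation: replace x in {0,1}^n with the box [0,1]^n.
relaxed = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)),
                     [x >= 0, x <= 1])
relaxed.solve()
x_round = np.round(x.value)          # project back onto {0,1}^n
print("relaxed objective:", relaxed.value)
print("rounded objective:", np.sum((A @ x_round - b) ** 2))
```

The package's heuristics generalize this pattern, combining relaxations and restrictions with projections onto the nonconvex set, local neighbor search, and ADMM, as the abstract describes.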