
271,635 catalog results

Book
1 online resource (v, [1], 30, [1] pages) : illustrations (some color).
Collection
Government Information United States Federal Collection
Changes in regulation enacted in 2013 have enabled the Alaska Fisheries Science Center's Fishery Monitoring and Analysis Division (FMA) and the Alaska Regional Office's Sustainable Fisheries Division to work collaboratively on an Annual Deployment Plan (ADP). Each ADP documents how the National Marine Fisheries Service (NMFS) plans to deploy observers into fishing activities for the coming year under the limits of available funding. Draft ADPs are presented to the North Pacific Fishery Management Council (Council) during September and October and are finalized in December. The sampling design for observer deployment has two elements: how the population is subdivided (i.e., stratification schemes) and how available samples are allocated (i.e., allocation strategies). Here the relative performance of 10 alternative sampling designs (at the primary sampling unit, the trip) is compared in support of the draft 2018 ADP. These alternative sampling designs consisted of the combination of two stratification schemes (gear type only or gear type × tendering activity), two metrics upon which to base optimizations [one consisting of discard of groundfish with Prohibited Species Catch (PSC) of Pacific halibut, and the other consisting of the former plus PSC of Chinook salmon], and three allocation strategies (no optimization, a "hurdle" approach to optimization, and an optimization-only approach). All optimization allocations incorporate three variables measured over the past 3 years: variance in the metric, the average cost of observing a trip, and the number of trips. The total affordable sample size is determined by the available budget and the average cost of observing each trip. Resulting selection rates derive from the sample size, the allocation weightings, and the anticipated fishing effort, which was defined as the most recent complete year of data. The total number of observer days that can be afforded is 4,062, which represents a 33% increase from 2017.
Gap analyses that examine the chance of at least one or at least three observed trips in a NMFS Area × gear type combination (cell) were used as a performance metric. Gap analyses illustrated that stratifications based on gear type (3 strata) were outperformed by stratifications based on gear type × tendering activity (6 strata). Potential gaps in observer coverage appear to be mostly concentrated in areas with low fishing effort, with fewer than 12 trips in a cell. Simulations were performed to measure the potential impact of unknown vessel participation in electronic monitoring (EM). The variability in gap analyses from randomized differences in EM participant vessels was relatively minor (less than 10% probability-of-observation shifts across deployment designs). The NMFS recommended an observer deployment design for the draft 2018 ADP that has gear type × tendering stratification and uses a "hurdle" approach to sample allocation, wherein 15% base coverage is obtained first across all strata and the remainder is optimized according to the variance in the metric of discarded groundfish catch combined with PSC of Pacific halibut and Chinook salmon. At their October 2017 meeting the Council did not support the NMFS recommendation and instead proposed a five-strata design with optimal sample allocations based on discarded groundfish catch and PSC of Pacific halibut only. Comparisons between the NMFS and Council recommended designs were included in the final 2018 ADP. [doi:10.7289/V5/TM-AFSC-364 (https://doi.org/10.7289/V5/TM-AFSC-364)]
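The "hurdle" allocation described in this abstract can be sketched in a few lines of code. This is a hypothetical illustration, not the ADP's actual implementation: the stratum dictionary, its `trips`/`stddev`/`cost` fields, and the numbers in the usage example are stand-ins for the real inputs (variance in the metric, average trip cost, and trip counts over the past 3 years), and the remainder is allocated by a textbook cost-weighted Neyman rule.

```python
import math

def hurdle_allocation(strata, budget, base_rate=0.15):
    """Two-stage "hurdle" sample allocation sketch.

    strata: {name: {"trips": N_h, "stddev": S_h, "cost": c_h}}
    budget: total affordable observation cost.
    Step 1 buys base_rate coverage in every stratum; step 2 spends the
    remainder by cost-weighted Neyman allocation, n_h ∝ N_h * S_h / sqrt(c_h).
    """
    # Step 1: base coverage across all strata.
    base = {h: base_rate * s["trips"] for h, s in strata.items()}
    spent = sum(base[h] * strata[h]["cost"] for h in strata)
    remaining = budget - spent
    # Step 2: optimize the remainder; scale k so the extra cost equals
    # exactly the remaining budget.
    w = {h: s["trips"] * s["stddev"] / math.sqrt(s["cost"])
         for h, s in strata.items()}
    k = remaining / sum(w[h] * strata[h]["cost"] for h in strata)
    return {h: base[h] + k * w[h] for h in strata}
```

Dividing each stratum's allocated trips by its anticipated effort then yields the selection rates the abstract refers to.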
Book
1 online resource.
Fully automated driving will require intelligent systems capable of understanding, reacting to, and interacting with the intricate complexities of the real world. With the onset of autonomous driving it becomes increasingly necessary to develop advanced tools for establishing trust in intelligent safety systems that act without or despite human input. This thesis presents novel contributions to simulation-based validation, including human driver behavior and sensor models, distributions over driving scenes, and a new technique for the accelerated validation of advanced automotive active safety systems such as autonomous vehicles. Advances to human driver behavior models include the introduction of behavioral cloning models based on Bayesian networks that better capture driver behavior over short horizons. A general input architecture for deep sensor models is introduced and used to develop a stochastic model over an automotive radar's power field. Original contributions are made to the representation of distributions over driving scenes and situations, which must capture a variable number of traffic participants on arbitrary roadways. Finally, this thesis introduces a new method for accelerated validation using importance sampling over clusters of critical situations, prioritizing simulation of critical scenes and avoiding countless simulations of benign driving scenarios while backing out the correct performance statistics.
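The accelerated-validation idea in this abstract, concentrating simulation effort on critical scenarios and reweighting to "back out" the correct statistics, is classical importance sampling. A minimal one-dimensional sketch follows; it estimates a toy rare-event probability rather than anything from the thesis's driving-scene models, and the shifted-normal proposal is chosen purely for illustration.

```python
import math
import random

def is_estimate(threshold=4.0, n=100_000, seed=0):
    """Estimate the rare-event probability P(X > threshold) for X ~ N(0, 1)
    by importance sampling: draw from a proposal N(threshold, 1) centered on
    the rare region, and reweight each draw by the likelihood ratio
    p(x)/q(x) = exp(-threshold*x + threshold**2/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)        # sample from the proposal q
        if x > threshold:                    # the rare "failure" indicator
            total += math.exp(-threshold * x + threshold**2 / 2)
    return total / n
```

Naive Monte Carlo would need tens of millions of draws to see this event (true probability about 3.2e-5); the reweighted estimator recovers it from a small fraction of that, which is the same economy the thesis seeks over benign driving scenarios.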
Book
1 online resource (92 [that is, 191] pages) : illustrations, maps Digital: text file.
Collection
Stanford Research Data
Simultaneous brain and spinal cord functional MRI is emerging as a new tool in the study of the central nervous system, but it is technically challenging. Poor B0 homogeneity and the small size of the spinal cord are the principal obstacles to this nascent technology. We extend a dynamic shimming approach, first proposed by Finsterbusch for brain/spinal cord imaging, by shimming on a per-slice basis. We shim dynamically by optimizing the linear gradients and frequency offset for each slice in order to minimize off-resonance in both brain and spinal cord. Simultaneous acquisition of brain and spinal cord fMRI is achieved with high spatial resolution in the spine by means of an echo-planar RF pulse for reduced-FOV excitation; brain slices are acquired at full FOV. T2*-weighted images of the brain and spinal cord are acquired with high clarity and minimal observable image artifacts. fMRI experiments reveal task-consistent activation in motor cortices, the cerebellum, and the C6-T1 spinal segments. Consistent activation in both brain and spinal cord is observed at the individual level, not only the group level. High-quality functional results are obtained for a motor task. Because reduced-FOV excitation is applicable to any section of the spinal cord, future continuation of these methods holds great promise.
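The per-slice shim optimization this abstract describes, choosing a frequency offset and linear gradient terms that minimize the residual off-resonance in each slice, amounts to a least-squares plane fit to the measured field map. The following is a toy sketch under stated assumptions, not the authors' sequence code: the `field_map` structure (slice index mapped to sampled (x, y, off-resonance) points) and all variable names are hypothetical.

```python
def per_slice_shim(field_map):
    """For each slice, fit f(x, y) ~ f0 + Gx*x + Gy*y to the measured
    off-resonance map by least squares; applying (-f0, -Gx, -Gy) as the
    frequency offset and linear shim terms minimizes the residual field.

    field_map: {slice_index: [(x, y, off_resonance_hz), ...]}
    Returns {slice_index: (freq_offset, shim_x, shim_y)}.
    """
    def solve3(A, b):
        # Gaussian elimination with partial pivoting on a 3x3 system.
        n = 3
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for c in range(n):
            p = max(range(c, n), key=lambda r: abs(M[r][c]))
            M[c], M[p] = M[p], M[c]
            for r in range(c + 1, n):
                k = M[r][c] / M[c][c]
                for j in range(c, n + 1):
                    M[r][j] -= k * M[c][j]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][j] * x[j]
                                  for j in range(r + 1, n))) / M[r][r]
        return x

    shims = {}
    for z, pts in field_map.items():
        # Build the normal equations for the design functions (1, x, y).
        A = [[0.0] * 3 for _ in range(3)]
        b = [0.0] * 3
        for x, y, f in pts:
            phi = (1.0, x, y)
            for i in range(3):
                b[i] += phi[i] * f
                for j in range(3):
                    A[i][j] += phi[i] * phi[j]
        f0, gx, gy = solve3(A, b)
        shims[z] = (-f0, -gx, -gy)
    return shims
```

In the dynamic-shimming scheme, these per-slice terms are then updated on the scanner before each slice excitation rather than held fixed for the whole volume.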
Book
1 online resource ([408] pages) : color illustrations, color maps
Collection
Government Information United States Federal Collection
Book
1 online resource (v, [1], 58, [1] pages) : color illustrations, color maps.
Collection
Government Information United States Federal Collection
A genetic analysis of the prohibited species catch (PSC) of chum salmon (Oncorhynchus keta) collected during 2016 from the federally managed walleye pollock (Gadus chalcogrammus) trawl fishery in the Bering Sea and from the federal groundfish fisheries in the Gulf of Alaska (GOA) was undertaken to determine the overall stock composition of the sample sets. Samples were genotyped for 11 microsatellite markers from which stock contributions were estimated using the current chum salmon microsatellite baseline. In 2016, one genetic sample was collected for every 30.6 chum salmon caught in the Bering Sea midwater trawl fishery. The evaluation of sampling in the Bering Sea based on time, location, and vessel indicated that the genetic samples were representative of the total chum salmon PSC in the Bering Sea. The majority of the 114 chum salmon samples from the A-season were from Northeast Asia (37%) and Eastern GOA/Pacific Northwest (PNW) (37%) stocks. Based on the analysis of 2,701 chum salmon collected throughout the B-season, the largest stock groups in the catch were Eastern GOA/PNW (35%) and Northeast Asia (31%), followed by Western Alaska (19%), Southeast Asia (9%), Upper/Middle Yukon (5%), and Southwest Alaska (< 2%) stocks. The chum salmon caught in the Bering Sea in 2016 shared general patterns of stock distribution with those from past years, but differed by some finer-scale spatiotemporal strata. Of the 473 chum salmon samples from the GOA groundfish fisheries, the highest proportion was from Eastern GOA/PNW (93%) stocks, similar to previous years. [doi:10.7289/V5/TM-AFSC-366 (https://doi.org/10.7289/V5/TM-AFSC-366)]
Book
1 online resource.
Lung cancer is the leading cause of cancer deaths worldwide, with 1.7 million deaths per year. Moreover, lung cancer comprises several histological subtypes, with non-small cell lung cancer (NSCLC) making up ~85% of cases. Presently, the most important prognostic factor in NSCLC is the stage of disease. However, the vast majority of patients have locally advanced or metastatic disease at the time of diagnosis. Despite current therapies consisting of chemotherapy, radiation, and surgery, patients with locally advanced NSCLC have heterogeneous outcomes. Recently, the advent of targeted therapies against specific genetic mutations or rearrangements has improved response rates and overall survival, but there remains an unmet need to uncover novel genes involved in this disease. We have applied bioinformatics approaches to better elucidate the role of various genes or gene signatures in non-small cell lung cancer in several capacities: 1) to uncover gene expression signatures to improve risk stratification and outcome prediction in those diagnosed with early-stage NSCLC; 2) to identify de novo genomic fusions and breakpoints from targeted paired-end DNA sequencing data; and 3) to elucidate the function of a novel splicing factor mutation in non-small cell lung cancer. In the first application, we obtained gene expression profiles from 1106 non-squamous NSCLCs for the generation and internal validation of a 9-gene molecular prognostic index (MPI). This was validated on an independent cohort of FFPE tissues from 98 NSCLC patients. As a second application, we demonstrate a practical and robust method for identifying DNA rearrangements resulting in gene fusions through the development of a Fusion And Chromosomal Translocation Enumeration and Recovery Algorithm (FACTERA). Some of these recurrent fusions, involving ALK, ROS1, RET, and NTRK1, have been identified in NSCLC, leading to the development and approval of targeted therapies.
Lastly, the third application identifies a novel recurrent splicing factor mutation in non-small cell lung cancer. We have characterized the binding and splicing properties of this splicing factor and the functional changes associated with the mutation. The approaches described here can be applied to uncover other gene or gene expression signatures in NSCLC and can be expanded to other cancer indications.
Collection
Master's Theses, Stanford Earth
Geomechanical simulators can be used to predict production-induced subsidence. However, the uncertainty in these predictions depends on the uncertainty in the flow and geomechanical parameters, such as permeability and Young's modulus. In this thesis, we develop a derivative-free optimization procedure for calibrating flow and geomechanical parameters. The method entails the use of the iteratively coupled flow-geomechanics module in Stanford's Automatic Differentiation General Purpose Research Simulator (AD-GPRS) combined with the mesh adaptive direct search (MADS) optimizer. Multiple sets of history matched parameters are generated using the randomized maximum likelihood (RML) procedure. Both production and displacement (geomechanical) data can be assimilated within our framework. Data types considered in this work include well production data, surface displacement data and well displacement data, and the impact of these different types of data on predictions and parameter calibration is assessed. We first consider a three-layer model with no overburden rock. For this case, we show that assimilating either type of displacement data is useful for calibrating the harmonic average Young's modulus as well as permeability in each layer. Assimilation of either type of displacement data reduces uncertainty in both surface displacement and production predictions. Assimilation of production data, by contrast, improves permeability calibration and reduces uncertainty in production forecasts, but not in displacement predictions. We also show that history matching surface displacement data does not provide calibration of individual layer Young's moduli, though it does enable accurate calibration of the harmonic average Young's modulus over all layers. History matching well displacement data provides calibration of individual layer Young's moduli. We next consider a more realistic (though two-dimensional) geomechanical model that includes overburden rock and bedrock. 
For this case we again demonstrate that both surface and well displacement data are useful for reducing uncertainty in surface displacement predictions and for estimating the harmonic average Young's modulus over all reservoir layers. Well displacement data are shown to again enable the calibration of individual layer Young's moduli.
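The "harmonic average Young's modulus" that the abstract above says is calibrated by surface displacement data is the thickness-weighted harmonic mean of the layer moduli: for layers stacked in series, compliances add, so the effective vertical stiffness is dominated by the softest layers. A minimal illustration (the layer thicknesses and moduli in the usage example are made-up values, not from the thesis):

```python
def harmonic_average_modulus(layers):
    """Thickness-weighted harmonic mean of layer Young's moduli.

    layers: iterable of (thickness, youngs_modulus) pairs.
    For vertical compaction of a layered column, layer compliances h/E add
    in series, so E_eff = (sum h) / (sum h/E).
    """
    total_thickness = sum(h for h, _ in layers)
    return total_thickness / sum(h / E for h, E in layers)
```

This also shows why surface subsidence alone cannot resolve individual layer moduli: many different layer combinations share the same harmonic average, which is the non-uniqueness the thesis resolves with well displacement data.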
Collection
Master's Theses, Stanford Earth
Compositional simulation is commonly applied to evaluate enhanced oil recovery and carbon storage processes. Compositional simulation is expensive because the models may include multiple components along with complex phase behavior. Thus, when compositional systems are optimized, the optimization algorithms must be robust and efficient. In this work, we assess different procedures for well control optimization in oil-gas compositional systems. The optimization algorithms considered include a standalone adjoint-gradient-based method, a standalone particle swarm optimization (PSO) procedure, and several hybridizations of these two approaches. We introduce and evaluate two criteria for switching from PSO to adjoint-gradient-based optimization in the hybrid procedures. These approaches are referred to as deterministic switching, where we switch after a specified number of PSO iterations, and adaptive switching, where the algorithmic switch is based on PSO performance. All methods are implemented and run within Stanford’s Unified Optimization Framework (UOF). The performance of the standalone and hybrid procedures is assessed for two example cases involving well control optimization in Gaussian and channelized geological models. The time-varying bottom-hole well pressures are treated as the optimization variables, and net present value (NPV) is optimized. Each method under consideration is run at least nine times to account for the stochastic nature of PSO and the sensitivity to the initial guess in the adjoint-gradient method. This allows us to draw meaningful conclusions on the relative performance of the various procedures. The two example cases presented in this study demonstrate that the hybrid PSO-adjoint-gradient procedure utilizing adaptive switching outperforms the standalone algorithms in terms of average optimal NPV. By plotting the average optimal NPV versus computational effort for the different methods, we can construct a Pareto front. 
For both cases, the Pareto-optimal procedures include standalone PSO (which provides lower average NPV but requires less computation) and the hybrid PSO-adjoint-gradient method using adaptive switching. The latter gives higher average NPV but at the cost of increased computational effort.
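The adaptive-switching hybrid described above can be sketched as follows. This is a toy minimization stand-in under stated assumptions, not the thesis's implementation: the real work maximizes NPV in Stanford's UOF with an adjoint gradient from the reservoir simulator, while here a user-supplied analytic gradient, a generic PSO loop, and all parameter values (inertia, swarm size, tolerance) are illustrative.

```python
import random

def hybrid_pso_gradient(f, grad, bounds, swarm=20, window=5, tol=1e-6,
                        max_pso_iters=200, grad_iters=100, lr=0.05, seed=1):
    """Hybrid optimizer sketch with adaptive switching: particle swarm
    explores until its best value stagnates over `window` iterations, then
    projected gradient descent refines the incumbent solution."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda v, d: min(max(v, bounds[d][0]), bounds[d][1])
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    best = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[best][:], pbest_val[best]
    history = [gbest_val]
    for _ in range(max_pso_iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = clip(pos[i][d] + vel[i][d], d)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
        history.append(gbest_val)
        # Adaptive switch: leave PSO once the gain over the last
        # `window` iterations falls below the tolerance.
        if len(history) > window and history[-1 - window] - history[-1] < tol:
            break
    # Local refinement of the PSO incumbent by projected gradient descent.
    x = gbest[:]
    for _ in range(grad_iters):
        g = grad(x)
        x = [clip(x[d] - lr * g[d], d) for d in range(dim)]
    return x, f(x)
```

Deterministic switching would simply replace the stagnation test with a fixed iteration count; the thesis's result is that tying the switch to PSO performance gives better average objective values for comparable effort.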
Book
1 online resource.
Neutrinos have been shown to have non-zero mass; however, how they generate their minuscule masses is an open question. One well-motivated possibility is that neutrinos have Majorana masses, for which the most sensitive test is the observation of neutrinoless double-beta decay. The half-life of this neutrinoless mode is much longer than that of the observed two-neutrino mode of double-beta decay, which is itself many orders of magnitude longer than the age of the universe; searches are therefore heavily background dominated. This work discusses two completely distinct methods to improve the discrimination of the neutrinoless double-beta decay of xenon-136 from backgrounds. The first method is training new discriminators to more fully exploit the observed topological information in EXO-200 to distinguish neutrinoless double-beta decay from the most common backgrounds. The second is to enable the observation of the barium-136 daughter resulting from double-beta decay in a future-generation detector via a hardware-centric approach. This path requires the high-efficiency extraction of heavy ions from a similarly heavy medium, from high-pressure gas to vacuum. Work on a prototype extraction apparatus for the nEXO collaboration, and the lessons learned, are presented here.
Book
1 online resource (43 pages) : color illustrations, 1 color map
Collection
Government Information United States Federal Collection
Book
1 online resource.
In an 1883 address, Frederick Douglass outlined how the escalating criminalization of black people had become a key process for sustaining a racial caste system post-slavery, acutely observing the "general disposition in this country to impute crime to color." In this dissertation, I argue that the ongoing development of the "crime to color" relationship that Douglass emphasized raises important philosophical challenges for our understanding of how intentional action unfolds in the social sphere. Contending that philosophical accounts of human agency must incorporate an analysis of sociality and power, Missing in Action: Agency, Race, & Recognition proposes two interventions. First, I argue that the conceptual relationship between "criminality" and "blackness" is more than a problem of racist concept associations, and has instead become conflated and embedded in dominant U.S. discourses and understandings. I introduce "racial conflation" as a frame through which we can analyze the persistent yoking of the concepts "blackness" and "criminality." Racial conflation triggers others' reconstruction of black action into criminal action, and conjures fictive intentions to rationalize those reconstructions. The problem of racial conflation does not merely reflect unjust, unfair, or untrue characterizations of black people, but is instead a matter of the structure of concepts themselves, making the relationship a foundational theoretical problem for philosophy of action. Second, I build on Hegel's theory of the sociality of intention and Maria Lugones' analysis of power and agency to propose a framework for intentional action that can account for racial conflation. I argue that the intentions of black agents are vacated by others and replaced with what I describe as "phantom intentions," or explanations constructed through the logic of racial conflation.
With a specific focus on cases of black women's agentic strategies, I enlist feminist of color theory and feminist epistemology to propose a methodological framework to more effectively theorize intentional action, arguing that we must not only ask "how much" agency subjects are granted in conditions of oppression, but also "what kind" of agencies are projected onto subjects so that these conditions of oppression are rendered legitimate.
Collection
Stanford Research Data
This is a snapshot of the NeuroVault.org database. It includes all public collections of statistical maps deposited in NeuroVault, provided they were linked to an external publication. This snapshot is an attempt to improve the chances of long-term persistence of data deposited in NeuroVault.org.
Book
1 online resource.
Solving time-harmonic wave equations in the high-frequency regime is an important yet numerically challenging problem. This dissertation presents three fast preconditioners for time-harmonic high-frequency wave equations under different problem settings. The first preconditioner adopts a recursive approach from the moving-PML sweeping preconditioner for the 3D Helmholtz equation, which reduces both the setup and application costs to linear while keeping the iteration count insensitive to frequency. The second is an enhancement of the sparsifying preconditioner for periodic structures that takes local potential information into account, which improves the accuracy of the preconditioner and makes the iteration count essentially independent of the problem size. The third assembles the key ideas of the first two, yielding a highly efficient preconditioner for the Lippmann-Schwinger equation in which both the setup and application costs are linear. Moreover, numerical results show that the iteration count grows only logarithmically as the frequency increases. To the best of our knowledge, this is the first method that achieves near-linear cost for solving the Lippmann-Schwinger equation in the 3D high-frequency regime.
Book
1 online resource.
The lateral ventricle subventricular zone (SVZ) is a frequent site of high-grade glioma (HGG) spread, and tumor invasion of the SVZ is a predictor of a worse clinical prognosis. A range of HGG types invade the SVZ stem cell niche, including both adult glioblastoma and pediatric high-grade gliomas such as diffuse intrinsic pontine glioma (DIPG). The cellular and molecular mechanisms mediating this frequent invasion of the SVZ are poorly understood. Here, we demonstrate that neural precursor cell (NPC):glioma cell communication underpins the propensity of glioma to colonize the SVZ. SVZ NPCs secrete chemoattractant signals toward which glioma cells home. Biochemical and proteomic analyses of factors secreted by SVZ NPCs revealed a discrete list of candidate proteins. Necessity and sufficiency testing implicated the neurite outgrowth-promoting factor pleiotrophin, along with three required binding partners (secreted protein acidic and rich in cysteine (SPARC), SPARC-like protein 1, and heat shock protein 90B), as key mediators of this chemoattractant effect. Pleiotrophin protein expression is strongly enriched in the SVZ, and knockdown of pleiotrophin expression starkly reduced glioma invasion of the SVZ in the adult murine brain. Pleiotrophin, in complex with the three binding partners, activated the Rho/Rho kinase pathway in DIPG cells, and inhibition of Rho kinase resulted in decreased DIPG invasion toward SVZ neural precursor cell-secreted factors. These findings demonstrate a pathogenic role for neural precursor cell:glioma interactions and potential therapeutic targets to limit glioma invasion.
Book
1 online resource.
Materials search and discovery is crucially important in condensed matter physics. Besides experimental trial and error, there exist two types of methods to guide materials exploration: "from theory," which starts from theoretical analysis and numerical simulations, and "from data," which leverages massive materials datasets via statistical machine learning. I will present one work for each of these two methods of materials discovery in this dissertation. Firstly, I will discuss the theoretical proposal and materials realization of an antiferromagnetic Dirac semimetal. I will specifically show how a non-symmorphic crystal symmetry stabilizes a four-fold degenerate point in the electronic band structure of an antiferromagnetic system that is invariant under the combination of time-reversal and inversion symmetry, thus realizing massless Dirac fermions as low-energy excitations. Secondly, I will discuss how to learn atoms' properties from extensive materials data, inspired by ideas from computational linguistics. I will present an analysis of the constructed atom vectors, as well as their applications in data-driven materials prediction using machine learning.
Book
1 online resource (vi, 71 pages) : illustrations (some color), maps (some color).
Collection
Government Information United States Federal Collection
Collection
Undergraduate Theses, School of Engineering
Heart disease is the world’s leading cause of mortality, accounting for over 7.4 million deaths each year. Consequently, there has been immense interest in the medical and scientific community to develop technologies to help regenerate injured hearts. 3D printing of biological materials (i.e., 3D bioprinting) is an innovative technology that may be applied to help us understand and treat cardiac disease. In my thesis work, I utilized 3D bioprinting to create a viable and functional artificial cardiac construct. Human umbilical vein endothelial cells (HUVECs) and human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) were encapsulated in 10% gelatin methacrylate (GelMA) hydrogels and subsequently bioprinted into 3D tissue constructs. Analysis of the printed tissues over time demonstrated cell viability as assessed by their metabolic activity in culture. Cell migration and proliferation were observed in the printed HUVEC tissues 7 days post printing and beating was observed in the hiPSC-CM tissues 3 days post printing. In addition, I demonstrated the endothelialization of acellular printed scaffolds using a perfusion bioreactor system. In the future, I plan to utilize the 3D tissue printing and bioreactor platform for in vitro applications such as personalized drug cardiotoxicity screening and in vivo applications such as generating a cardiac patch for transplantation into an animal model of myocardial infarction.
Collection
Undergraduate Theses, School of Engineering
The medial prefrontal cortex (mPFC) is important in guiding learned, reward-seeking behavior through the integration of relevant information from multiple brain regions. Multiple studies have used the head-fixed Go/No-Go task to assess mPFC function in decision-making because rodents learn the task quickly and it is compatible with several imaging strategies. However, this paradigm has two inherent flaws: 1) head-fixation does not emulate natural behavior, and 2) the motivational state of the animals is ambiguous because they are forced to perform task trials at an experimenter-defined schedule. To assess the function of mPFC in decision making in a more naturalistic context, I have designed and successfully built a behavioral system that robustly reads out a two-alternative forced choice (2AFC) behavior in freely moving mice. First, this behavioral system allows freely moving mice to perform learned choice behavior in a custom-designed behavioral box that is dimensioned to mimic the home cage. Second, mice are able to self-initiate task trials without external constraints placed upon them, which is made possible by motion-sensing hardware. This behavioral system is also built to be flexible, allowing easy control of task parameters and integration of external devices such as imaging cameras. The construction of this system combines sensory systems, MATLAB software, and the Arduino platform. By using this behavioral setup, I have effectively trained mice in the 2AFC task, and have recently begun to perform in vivo Ca2+ imaging using the Inscopix mini-endoscopic microscopy system. By correlating the real-time activity patterns of genetically/anatomically defined mPFC neurons with freely moving choice behavior, I have found cell-type-specific activity that correlates with specific outcomes of the 2AFC task.
Collection
Undergraduate Honors Theses, Graduate School of Education
Despite the emergence of formal “critical whiteness studies” in academia over the last twenty years, there currently remains a dearth of research on the teaching of whiteness studies. Additionally, there remains a lack of formal engagement with the topics of race and whiteness, especially by white students in universities across the United States. This paper investigates the potential for implementing anti-racist critical whiteness studies at Stanford University, seeking to understand how the field of whiteness studies is taught on college campuses and what Stanford students think about studying race and whiteness. The paper analyzes results from a survey of 200 Stanford students that aimed to collect student reactions to studying race and whiteness, as well as students’ impressions of their own qualification and comfort in discussing race with friends and family. Results indicate a strong desire among the Stanford student body to study race, and provide an argument for Stanford University to establish critical whiteness studies within a framework of anti-racism education. The paper proceeds to analyze eighteen syllabi for classes on whiteness, concluding that while there are many ways to teach about whiteness, there is a distinct and common approach which uses high-level concepts, such as race as a social construction with intersectional implications and a universalized understanding of whiteness, to promote the visibility of whiteness to students in the class. There remain significant tensions around analyzing racism without centering or privileging whiteness. However, taken together, Stanford students’ endorsement of race and whiteness studies and the sample syllabi’s formal structure for teaching about anti-racist critical whiteness provide a case for Stanford University to implement such a curriculum.