Search results

2,974 catalog results

Book
1 online resource (13 pages) : digital, PDF file.
We present the results of an evaluation of new features of the latest release of IBM's GPFS filesystem (v3.2). We investigate different ways of connecting to a high-performance GPFS filesystem from a remote cluster using InfiniBand (IB) and 10 Gigabit Ethernet. We also examine the performance of the GPFS filesystem with both serial and parallel I/O. Finally, we present our recommendations for effective ways of utilizing high-bandwidth networks for high-performance I/O to parallel file systems.
Book
1 online resource (6 pages) : digital, PDF file.
In a previous humorous note entitled 'Twelve Ways to Fool the Masses,' I outlined twelve common ways in which performance figures for technical computer systems can be distorted. In this paper and accompanying conference talk, I give a reprise of these twelve 'methods' and present some actual examples that have appeared in the peer-reviewed literature in years past. I then propose guidelines for reporting performance, the adoption of which would raise the level of professionalism and reduce the level of confusion, not only in the world of device simulation but also in the larger arena of technical computing.
Book
1 online resource.
Recent laser accidents and incidents at research laboratories across the Department of Energy complex are reviewed in this paper. Factors that contributed to the accidents are examined. Conclusions drawn from the accident reports are summarized and compared, as are control measures that could have been implemented to prevent the accidents. Recommendations for improving laser safety programs are outlined, and progress toward achieving them is summarized.
Book
1 online resource.
In the U.S., the increasing financial support for customer-sited photovoltaic (PV) systems provided through publicly funded incentive programs has heightened concerns about the long-term performance of these systems. Given the barriers that customers face to ensuring that their PV systems perform well, and the responsibility that PV incentive programs bear to ensure that public funds are prudently spent, these programs should, and often do, play a critical role in addressing PV system performance. To provide a point of reference for assessing the current state of the art, and to inform program design efforts going forward, we examine the approaches to encouraging PV system performance used by 32 prominent PV incentive programs in the U.S. We identify eight general strategies or groups of related strategies that these programs have used to address factors that affect performance, and describe key implementation details. Based on this review, we then offer recommendations for how PV incentive programs can be effectively designed to mitigate potential performance issues.
Book
1 online resource.
The Institute of Medicine (IOM) of the National Academy of Sciences recently completed a critical review of the scientific literature pertaining to the association of indoor dampness and mold contamination with adverse health effects. In this paper, we report the results of quantitative meta-analysis of the studies reviewed in the IOM report. We developed point estimates and confidence intervals (CIs) to summarize the association of several respiratory and asthma-related health outcomes with the presence of dampness and mold in homes. The odds ratios (ORs) and confidence intervals from the original studies were transformed to the log scale, and random-effects models were applied to the log odds ratios and their variances. Models were constructed both accounting for the correlation between multiple results within the studies analyzed and ignoring such potential correlation. Central estimates of ORs for the health outcomes ranged from 1.32 to 2.10, with most central estimates between 1.3 and 1.8. The 95% confidence intervals excluded unity except in two of 28 instances, and in most cases the lower bound of the CI exceeded 1.2. In general, the two meta-analysis methods produced similar estimates for ORs and CIs. Based on the results of the meta-analyses, building dampness and mold are associated with approximately 30% to 80% increases in a variety of respiratory and asthma-related health outcomes. The results of these meta-analyses reinforce the IOM's recommendation that actions be taken to prevent and reduce building dampness problems.
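The pooling procedure this abstract describes (transform ORs and CIs to the log scale, fit a random-effects model, back-transform) can be sketched as follows. The study values below are invented for illustration, and the DerSimonian-Laird estimator is one common choice of random-effects method; the report does not say which estimator its authors used.

```python
import math

def dersimonian_laird(odds_ratios, ci_lowers, ci_uppers):
    """Pool study-level odds ratios with a random-effects model.

    Each study's log OR and its variance are recovered from the
    reported 95% CI (on the log scale), then combined using the
    DerSimonian-Laird estimate of between-study variance (tau^2).
    """
    y = [math.log(or_) for or_ in odds_ratios]
    # SE from a 95% CI: (log(upper) - log(lower)) / (2 * 1.96)
    v = [((math.log(u) - math.log(l)) / (2 * 1.96)) ** 2
         for l, u in zip(ci_lowers, ci_uppers)]

    # Fixed-effect weights and Cochran's Q heterogeneity statistic
    w = [1.0 / vi for vi in v]
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))

    # DerSimonian-Laird tau^2, truncated at zero
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)

    # Random-effects weights incorporate tau^2
    w_re = [1.0 / (vi + tau2) for vi in v]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))

    pooled_or = math.exp(y_re)
    ci = (math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re))
    return pooled_or, ci

# Invented study results, for illustration only: ORs with 95% CIs
or_pooled, (lo, hi) = dersimonian_laird(
    odds_ratios=[1.5, 1.8, 1.3],
    ci_lowers=[1.1, 1.2, 0.9],
    ci_uppers=[2.0, 2.7, 1.9])
```

When the pooled CI excludes unity, as in the abstract's "two of 28 instances" remark, the association is statistically distinguishable from no effect at the 95% level.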
Book
1 online resource (20 pages) : digital, PDF file.
In order to facilitate access to the large volumes of data (multiple petabytes per year) which will be produced during data taking and Monte Carlo production at ATLAS, work has proceeded on building a system of event-level metadata to allow selections of a subset of events to use as input to an analysis. This was included in the ATLAS Computing Model and was first studied and implemented by the Physics Analysis Tools group based on the decisions of the ESD/AOD Task Force. They used tools developed and supported by the CERN IT group and the ATLAS Database group. During 2005 this structure was put through various tests and evaluations. Also, work by physicists on reconstruction and analysis led to an improved understanding of the requirements on the TAG. This report addresses the effect of these new inputs on the previous work with regard to content and the infrastructure needed to support it.
Book
1 online resource (74 pages) : digital, PDF file.
This White Paper summarizes the outcome of the Town Meeting on Phases of QCD that took place January 12-14, 2007 at Rutgers University, as part of the NSAC 2007 Long Range Planning process. The meeting was held in conjunction with the Town Meeting on Hadron Structure, including a full day of joint plenary sessions of the two meetings. Appendix A.1 contains the meeting agenda. This Executive Summary presents the prioritized recommendations that were determined at the meeting. Subsequent chapters present the essential background to the recommendations. While this White Paper is not a scholarly article and contains few references, it is intended to provide the non-expert reader
Book
1 online resource (vp.) : digital, PDF file.
No abstract prepared.
Book
1.4 Megabytes : digital, PDF file.
One of the widely used methodologies for describing the behavior of a structural system subjected to seismic excitation is response spectrum modal dynamic analysis. Several modal combination rules are proposed in the literature to combine the responses of individual modes in a response spectrum dynamic analysis. In particular, these modal combination rules are used to estimate the representative maximum value of a particular response of interest for design purposes. Furthermore, these combination rules also provide guidelines for combining the representative maximum values of the response obtained for each of the three orthogonal spatial components of an earthquake. This report mainly focuses on the implementation of different modal combination rules into GEMINI [1].
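Two of the modal combination rules commonly proposed in the literature can be sketched as follows: SRSS, and CQC with the Der Kiureghian correlation coefficient under equal modal damping. This is a generic textbook formulation for illustration, not GEMINI's actual implementation.

```python
import math

def srss(responses):
    """Square-Root-of-Sum-of-Squares combination of modal maxima."""
    return math.sqrt(sum(r * r for r in responses))

def cqc(responses, freqs, damping=0.05):
    """Complete Quadratic Combination of modal maxima.

    Uses the Der Kiureghian cross-modal correlation coefficient
    with the same damping ratio assumed for every mode.
    """
    z = damping
    n = len(responses)
    total = 0.0
    for i in range(n):
        for j in range(n):
            r = freqs[j] / freqs[i]  # frequency ratio of the mode pair
            num = 8 * z**2 * (1 + r) * r**1.5
            den = (1 - r**2)**2 + 4 * z**2 * r * (1 + r)**2
            rho = num / den  # equals 1 when i == j
            total += rho * responses[i] * responses[j]
    return math.sqrt(total)

# Illustrative modal maxima (invented): two modes with peak
# responses 3.0 and 4.0 at frequencies 2 Hz and 10 Hz
r_srss = srss([3.0, 4.0])
r_cqc = cqc([3.0, 4.0], [2.0, 10.0])
```

For well-separated modes the cross-correlation terms are tiny, so CQC nearly reduces to SRSS; for closely spaced modes the cross terms grow and the two rules diverge, which is the usual argument for preferring CQC.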
Book
1 online resource (90 pages) : digital, PDF file.
The objective of this manual is to present guidelines and procedures for the preparation of new data for the Tertiary Oil Recovery Information System (TORIS) data base. TORIS is an analytical system currently maintained by the Department of Energy's (DOE) Bartlesville Project Office. It uses an extensive field- and reservoir-level data base to evaluate the technical and economic recovery potential of specific crude oil reservoirs.
Book
1 online resource.
This technical report is a summary of the progress made on "A Guidance Document for Kentucky's Oil and Gas Operators". During this quarter, the document received continued review and editing in an electronic format to satisfy the United States Department of Energy (DOE). Comments received from oil and gas operators reviewing the document prompted contact with the United States Environmental Protection Agency (USEPA) to develop an addendum section that better explains USEPA requirements for Class II injection wells in Kentucky.
Book
1 online resource (11 pages) : digital, PDF file.
Fundamental understanding of matter is a continuous process that should produce physical data for use by engineers and scientists in their work. Lack of fundamental property data in any engineering endeavor cannot be mitigated by theoretical work that is not confirmed by physical experiments. An engineering viewpoint will be presented to justify the need for understanding of matter. Examples will be given in the energy engineering field to outline the importance of further understanding of material and fluid properties and behavior. Cases will be cited to show the effects of various data bases in energy, mass, and momentum transfer. The status of fundamental data sources will be discussed in terms of data centers, new areas of engineering, and the progress in measurement techniques. Conclusions and recommendations will be outlined to improve the current situation faced by engineers in carrying out their work. 4 figures.
Book
ii, 7 p.
Green Library
Book
1 online resource (22 pages) : color illustrations.
Book
1 online resource (19 pages) : digital, PDF file.
As increasing numbers of photovoltaic (PV) systems are connected to utility systems, distribution engineers are becoming increasingly concerned about the risk of formation of unintentional islands. Utilities desire to keep their systems secure, while not imposing unreasonable burdens on users wishing to connect PV. However, utility experience with these systems is still relatively sparse, so distribution engineers often are uncertain as to when additional protective measures, such as direct transfer trip, are needed to avoid unintentional island formation. In the absence of such certainty, utilities must err on the side of caution, which in some cases may lead to the unnecessary requirement of additional protection. The purpose of this document is to provide distribution engineers and decision makers with guidance on when additional measures or additional study may be prudent, and also on certain cases in which utilities may allow PV installations to proceed without additional study because the risk of an unintentional island is extremely low. The goal is to reduce the number of cases of unnecessary application of additional protection, while giving utilities a basis on which to request additional study in cases where it is warranted.
Book
1 online resource.
The derived concentration guideline level (DCGL) is the allowable residual radionuclide concentration that can remain in soil after remediation of the site without radiological restrictions on the use of the site. It is sometimes called the single radionuclide soil guideline or the soil cleanup criteria. This report documents the methodology, scenarios, and parameters used in the analysis to support establishing radionuclide DCGLs for Argonne National Laboratory's Building 310 area.
Book
1 online resource : digital, PDF file.
A complex geology lies beneath the Hanford Site of southeastern Washington State. Within this geology is a challenging large-scale environmental cleanup project. Geologic and contaminant transport information generated by several U.S. Department of Energy contractors must be documented in geologic graphics clearly, consistently, and accurately. These graphics must then be disseminated in formats readily acceptable by general graphics and document producing software applications. The guidelines presented in this document are intended to facilitate consistent, defensible, geologic graphics and digital data/graphics sharing among the various Hanford Site agencies and contractors.
Book
1 online resource.
This document contains a summary of the main findings from our full report entitled 'Wind Power Forecasting: State-of-the-Art 2009'. The aims of this document are to provide guidelines and a quick overview of the current state of the art in wind power forecasting (WPF) and to point out lines of research in the future development of forecasting systems.
Book
1 online resource.
As random shotgun metagenomic projects proliferate and become the dominant source of publicly available sequence data, procedures for best practices in their execution and analysis become increasingly important. Based on our experience at the Joint Genome Institute, we describe step-by-step the chain of decisions accompanying a metagenomic project from the viewpoint of a bioinformatician. We guide the reader through a standard workflow for a metagenomic project, beginning with pre-sequencing considerations such as community composition and sequence data type that will greatly influence downstream analyses. We proceed with recommendations for sampling and data generation, including sample and metadata collection, community profiling, construction of shotgun libraries, and sequencing strategies. We then discuss the application of generic sequence processing steps (read preprocessing, assembly, and gene prediction and annotation) to metagenomic datasets by contrast to genome projects. Different types of data analyses particular to metagenomes are then presented, including binning, dominant population analysis, and gene-centric analysis. Finally, data management systems and issues are presented and discussed. We hope that this review will assist bioinformaticians and biologists in making better-informed decisions throughout a metagenomic project.
Book
1 online resource.
The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers, and Grid middleware itself. Currently, most generated log files are useful only to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. A custom logging API, or even printf, could also be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) consistently structured, typed log events; (2) a standard high-resolution timestamp; (3) use of logging levels and categories to separate logs by detail and purpose; (4) consistent use of global and local identifiers; and (5) use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
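The five practices above can be sketched with Python's standard log API, which the abstract names as one acceptable choice. The event names, identifier keys, and key=value layout below are illustrative assumptions, not the format the document itself prescribes.

```python
import logging
from datetime import datetime, timezone

class GridLogFormatter(logging.Formatter):
    """Emit one newline-delimited ASCII line per event, carrying a
    high-resolution UTC timestamp, a typed event name, the logging
    level, and consistent key=value identifier fields."""
    def format(self, record):
        # (2) standard high-resolution (microsecond) UTC timestamp
        ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
        fields = {
            "ts": ts,
            # (1) a typed event name rather than free-form prose
            "event": getattr(record, "event", record.getMessage()),
            # (3) the logging level separates detail and purpose
            "level": record.levelname,
        }
        # (4) global/local identifiers (job id, host) on every event
        fields.update(getattr(record, "ids", {}))
        # (5) regular, newline-delimited ASCII key=value text
        return " ".join(f"{k}={v}" for k, v in fields.items())

logger = logging.getLogger("gridftp.transfer")
handler = logging.StreamHandler()
handler.setFormatter(GridLogFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# One structured event; "jobid" and "host" are hypothetical ids
logger.info("transfer.start",
            extra={"event": "transfer.start",
                   "ids": {"jobid": "job-42", "host": "node07"}})
```

Because every line shares the same ts/event/level/identifier shape, logs from many components can be merged, sorted by timestamp, and filtered by job id with ordinary text tools.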
