eScholarship
Open Access Publications from the University of California

LBL Publications

Lawrence Berkeley National Laboratory (Berkeley Lab) has been a leader in science and engineering research for more than 70 years. Located on a 200-acre site in the hills above the Berkeley campus of the University of California, overlooking the San Francisco Bay, Berkeley Lab is a U.S. Department of Energy (DOE) National Laboratory managed by the University of California. It has an annual budget of nearly $480 million (FY2002) and employs a staff of about 4,300, including more than a thousand students.

Berkeley Lab conducts unclassified research across a wide range of scientific disciplines, with key efforts in fundamental studies of the universe; quantitative biology; nanoscience; new energy systems and environmental solutions; and the use of integrated computing as a tool for discovery. It is organized into 17 scientific divisions and hosts four DOE national user facilities. Details on Berkeley Lab's divisions and user facilities are available on the Lab's website.

Deep Generative Models for Fast Photon Shower Simulation in ATLAS

(2024)

The need for large-scale production of highly accurate simulated event samples for the extensive physics programme of the ATLAS experiment at the Large Hadron Collider motivates the development of new simulation techniques. Building on the recent success of deep learning algorithms, variational autoencoders and generative adversarial networks are investigated for modelling the response of the central region of the ATLAS electromagnetic calorimeter to photons of various energies. The properties of synthesised showers are compared with showers from a full detector simulation using Geant4. Both variational autoencoders and generative adversarial networks are capable of quickly simulating electromagnetic showers with correct total energies and stochasticity, though the modelling of some shower shape distributions requires further refinement. This feasibility study demonstrates the potential of using such algorithms for ATLAS fast calorimeter simulation in the future and shows a possible way to complement current simulation techniques.
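To make the generative-model idea concrete, the following is a minimal sketch of a conditional variational autoencoder for shower generation, written in PyTorch. The layer sizes, latent dimension, flattened cell count, and energy conditioning are illustrative assumptions; this is not the architecture used by ATLAS.

```python
# Minimal conditional-VAE sketch for calorimeter shower generation.
# All dimensions and layer choices are illustrative, not the ATLAS models.
import torch
import torch.nn as nn

N_CELLS = 266   # flattened calorimeter cells (assumed shape)
LATENT = 10     # latent dimensionality (assumed)

class ConditionalShowerVAE(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: shower cells + photon energy -> Gaussian latent parameters.
        self.encoder = nn.Sequential(
            nn.Linear(N_CELLS + 1, 128), nn.ReLU(),
            nn.Linear(128, 2 * LATENT),  # mean and log-variance
        )
        # Decoder: latent sample + photon energy -> per-cell energy fractions.
        self.decoder = nn.Sequential(
            nn.Linear(LATENT + 1, 128), nn.ReLU(),
            nn.Linear(128, N_CELLS), nn.Softmax(dim=-1),
        )

    def forward(self, cells, energy):
        h = self.encoder(torch.cat([cells, energy], dim=-1))
        mu, logvar = h.chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        return self.decoder(torch.cat([z, energy], dim=-1)), mu, logvar

    @torch.no_grad()
    def sample(self, energy):
        # Fast simulation: draw latent noise, decode conditioned on energy.
        z = torch.randn(energy.shape[0], LATENT)
        return self.decoder(torch.cat([z, energy], dim=-1))

model = ConditionalShowerVAE()
showers = model.sample(torch.full((4, 1), 65.0))  # four showers at a nominal 65 GeV
```

The softmax output represents per-cell energy fractions that sum to one for each shower; scaling by a sampled total energy would give absolute cell energies.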

Artificial Intelligence for the Electron Ion Collider (AI4EIC)

(2024)

The Electron-Ion Collider (EIC), a state-of-the-art facility for studying the strong force, is expected to begin commissioning its first experiments in 2028. This is an opportune time for artificial intelligence (AI) to be included from the start at this facility and in all phases leading up to the experiments. The second annual workshop organized by the AI4EIC working group, which recently took place, centered on exploring all current and prospective application areas of AI for the EIC. The workshop not only benefits the EIC but also provides valuable insights for the newly established ePIC collaboration at the EIC. This paper summarizes the activities and R&D projects covered across the workshop sessions and provides an overview of the goals, approaches, and strategies regarding AI/ML in the EIC community, as well as cutting-edge techniques currently being studied in other experiments.

Software Performance of the ATLAS Track Reconstruction for LHC Run 3

(2024)

Charged particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions in the LHC is a challenging task for the ATLAS experiment's reconstruction software due to the combinatorial complexity. This paper describes the major changes made to adapt the software to promptly reconstruct high-activity collisions with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up) using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved compared to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant reduction in reconstruction efficiency and while reducing the rate of combinatorial fake tracks by more than a factor of two.
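For intuition about the combinatorial scaling at stake (an illustration, not a measurement from the paper): the number of naive three-hit track seeds grows roughly with the cube of the hit multiplicity, which itself rises approximately linearly with pile-up.

```python
# Back-of-the-envelope scaling of triplet-seed combinatorics with pile-up.
# Assumes hit multiplicity grows linearly with mu; numbers are illustrative.
def relative_seed_combinatorics(mu: float, mu_ref: float = 20.0) -> float:
    """Naive three-hit seed count relative to a reference pile-up mu_ref."""
    return (mu / mu_ref) ** 3

for mu in (20, 40, 60):
    print(f"mu = {mu}: ~{relative_seed_combinatorics(mu):.0f}x seeds vs mu = 20")
```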

The present and future of QCD

(2024)

This White Paper presents an overview of the current status and future perspectives of QCD research, based on community input and the scientific conclusions of the 2022 Hot and Cold QCD Town Meeting. We present the progress made in the last decade toward a deep understanding of both the fundamental structure of the sub-atomic matter of nucleons and nuclei in cold QCD, and the hot QCD matter in heavy-ion collisions. We identify key questions of QCD research and plausible paths to obtaining answers to those questions in the near future, hence defining the priorities of our research over the coming decades.

Defining weather scenarios for simulation-based assessment of thermal resilience of buildings under current and future climates: A case study in Brazil

(2024)

In response to increasingly severe weather conditions, optimization of building performance and investment provides an opportunity to consider the co-benefits of thermal resilience during energy efficiency retrofits. This work assesses the thermal resilience of buildings using building performance simulation to evaluate indoor overheating risk under nine weather scenarios, covering historical (2010s), mid-term future (2050s), and long-term future (2090s) typical meteorological years as well as heat wave years. The analysis is based on resilience profiles that combine six integrated indicators. A case study of a district of 92 buildings in Brazil was conducted, and a combination of strategies to improve thermal resilience was identified. The results reflect the necessity of planning for resilience in the context of climate change: strategies recommended under current conditions might not be ideal in the future, so an adaptable design should be prioritized. Cooling energy consumption could increase by 48% by the 2050s, while excessive overheating issues could affect 37% of the buildings. Simple passive strategies can significantly reduce heat stress. A comprehensive thermal resilience analysis should ultimately be accompanied by a thorough reflection on the stakeholders' objectives, available resources, and planning horizon, as well as the risks assumed by not being resilient.
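As a sketch of the kind of indicator such resilience profiles combine, the snippet below computes indoor degree-hours above a fixed comfort threshold for each weather scenario. The 26 °C threshold, the synthetic temperature series, and the scenario labels are illustrative assumptions, not the paper's definitions.

```python
# One illustrative overheating indicator: indoor degree-hours above a
# comfort threshold, computed per weather scenario. Inputs are synthetic.
import numpy as np

def overheating_degree_hours(indoor_temp_c, threshold_c=26.0):
    """Sum of hourly exceedances above the comfort threshold (degC*h)."""
    exceedance = np.asarray(indoor_temp_c) - threshold_c
    return float(np.clip(exceedance, 0.0, None).sum())

# Hypothetical annual hourly indoor temperatures (8760 values) per scenario.
scenarios = {
    "TMY_2010s": np.random.default_rng(0).normal(24.0, 3.0, 8760),
    "TMY_2050s": np.random.default_rng(1).normal(25.5, 3.0, 8760),
    "TMY_2090s": np.random.default_rng(2).normal(27.0, 3.0, 8760),
}
for name, series in scenarios.items():
    print(name, f"{overheating_degree_hours(series):.0f} degC*h")
```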

Energy flexibility quantification of a tropical net-zero office building using physically consistent neural network-based model predictive control

(2024)

Building energy flexibility plays a critical role in demand-side management, reducing utility costs for building owners and supporting sustainable, reliable, and smart grids. Realizing building energy flexibility in tropical regions requires solar photovoltaics and energy storage systems. However, quantifying the energy flexibility of buildings utilizing such technologies in tropical regions has yet to be explored, and a robust control sequence is needed for this scenario. Hence, this work presents a case study evaluating the energy flexibility controls and operations of a net-zero energy office building in Singapore. The case study utilizes a data-driven energy flexibility quantification workflow and employs a novel data-driven model predictive control (MPC) framework based on a physically consistent neural network (PCNN) model to optimize building energy flexibility. To the best of our knowledge, this is the first instance of a PCNN being applied in a mathematical MPC setting, and the stability of the system is formally proved. Three scenarios are evaluated and compared: the default regulated flat tariff, a real-time pricing mechanism, and an on-site battery energy storage system (BESS). Our findings indicate that incorporating real-time pricing into the MPC framework leverages building energy flexibility for control decisions more effectively than the flat-rate approach. Moreover, adding a BESS to the on-site PV generation improved building self-sufficiency and PV self-consumption by 17% and 20%, respectively. This integration also addresses model mismatch issues within the MPC framework, ensuring a more reliable local energy supply. Future research can leverage the proposed PCNN-MPC framework for different types of data-driven energy flexibility quantification.
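Below is a minimal receding-horizon MPC sketch of the general kind the paper builds around a PCNN. A placeholder linear thermal surrogate stands in for the PCNN (which, as described in the abstract, encodes physical consistency, e.g., more cooling can only lower the predicted temperature); the price series, comfort bound, horizon, and all coefficients are illustrative assumptions.

```python
# Receding-horizon MPC sketch with a placeholder thermal surrogate.
# The surrogate hard-codes the physical sign structure a PCNN would learn:
# ambient heat gain raises temperature, cooling power lowers it.
import numpy as np
from scipy.optimize import minimize

HORIZON = 12  # control steps, e.g. hours (assumed)

def predict_temps(t0, cooling_kw):
    """Placeholder model: temperature drifts up, cooling pulls it down."""
    temps, t = [], t0
    for u in cooling_kw:
        t = t + 0.5 - 0.3 * u  # illustrative heat-gain and cooling coefficients
        temps.append(t)
    return np.array(temps)

def mpc_step(t0, prices):
    def cost(u):
        temps = predict_temps(t0, u)
        energy_cost = np.dot(prices, u)                              # price-weighted energy
        comfort_penalty = 10.0 * np.sum(np.clip(temps - 26.0, 0, None) ** 2)
        return energy_cost + comfort_penalty
    res = minimize(cost, np.ones(HORIZON), bounds=[(0.0, 3.0)] * HORIZON)
    return res.x[0]  # apply only the first control action, then re-plan

prices = np.array([0.2] * 6 + [0.5] * 6)  # hypothetical real-time prices ($/kWh)
print(f"first cooling setpoint: {mpc_step(26.5, prices):.2f} kW")
```

Under real-time pricing, the optimizer shifts cooling toward cheap hours while the comfort penalty bounds overheating, which is the flexibility-leveraging behavior the abstract describes.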

Anthropogenic heat from buildings in Los Angeles County: A simulation framework and assessment

(2024)

Anthropogenic heat (AH), i.e., waste heat released from buildings to the ambient environment, increases urban air temperature and contributes to the urban heat island effect, which in turn drives more air-conditioning energy use and higher associated waste heat during summer, forming a positive feedback loop. This study used a bottom-up simulation approach to develop a dataset of annual hourly AH profiles for 1.7 million buildings in Los Angeles (LA) County for the year 2018, aggregated at three spatial resolutions: 450 m, 12 km, and the census tract. Building AH exhibits strong seasonal and diurnal patterns, as well as large spatial variations across the urban areas. Building AH peaks in May and reaches a maximum of 878 W/m² within one of several AH hotspots in the region. Among the three major AH components (surface convection, heat rejection from HVAC systems, and zonal air exchange), surface convection is the largest, accounting for 78% of the total building AH across LA County. Higher AH is associated with greater building density, a high percentage of industrial buildings, and older building stock. While AH peaks during the day, the resulting ambient temperature increases are much larger at night. During the July 2018 heatwave in LA County, building AH (excluding the surface component) led to a daily maximum ambient temperature increase of up to 0.6 °C and a daily minimum ambient temperature increase of up to 2.9 °C. Policy makers should consider reducing summer building AH when developing mitigation measures for cities to transition to clean energy while improving heat resilience.
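A minimal sketch of the bottom-up aggregation idea: per-building hourly waste-heat components are summed onto a spatial grid. The component names follow the abstract, while the building records, magnitudes, and grid indexing are synthetic placeholders, not the study's dataset.

```python
# Bottom-up AH aggregation sketch: per-building hourly components summed
# into grid-cell profiles. All inputs are synthetic placeholders.
import numpy as np

N_BUILDINGS, HOURS, N_GRID_CELLS = 1000, 8760, 64  # illustrative sizes

rng = np.random.default_rng(0)
# Hourly AH components per building (W); surface convection dominates.
surface = rng.uniform(50, 150, (N_BUILDINGS, HOURS))
hvac_rejection = rng.uniform(5, 30, (N_BUILDINGS, HOURS))
zonal_exchange = rng.uniform(2, 15, (N_BUILDINGS, HOURS))
cell_of_building = rng.integers(0, N_GRID_CELLS, N_BUILDINGS)  # grid cell id

total_ah = surface + hvac_rejection + zonal_exchange
# Aggregate building profiles into grid-cell profiles (hourly W per cell).
grid_profiles = np.zeros((N_GRID_CELLS, HOURS))
np.add.at(grid_profiles, cell_of_building, total_ah)
print(grid_profiles.shape, grid_profiles.max())
```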

Estimating geographic variation of infection fatality ratios during epidemics.

(2024)

OBJECTIVES: We aim to estimate geographic variability in total numbers of infections and infection fatality ratios (IFR; the number of deaths caused by an infection per 1,000 infected people) when the availability and quality of data on disease burden are limited during an epidemic. METHODS: We develop a noncentral hypergeometric framework that accounts for differential probabilities of positive tests and reflects the fact that symptomatic people are more likely to seek testing. We demonstrate the robustness, accuracy, and precision of this framework, and apply it to the United States (U.S.) COVID-19 pandemic to estimate county-level SARS-CoV-2 IFRs. RESULTS: The estimators for the numbers of infections and IFRs showed high accuracy and precision; for instance, when applied to simulated validation data sets, across counties, Pearson correlation coefficients between estimator means and true values were 0.996 and 0.928, respectively, and they showed strong robustness to model misspecification. Applying the county-level estimators to the real, unsimulated COVID-19 data spanning April 1, 2020 to September 30, 2020 from across the U.S., we found that IFRs varied from 0 to 44.69, with a standard deviation of 3.55 and a median of 2.14. CONCLUSIONS: The proposed estimation framework can be used to identify geographic variation in IFRs across settings.
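To illustrate the distributional core of the framework: if infected people are more likely to seek testing, the number of positive tests can be modeled with a Fisher noncentral hypergeometric distribution, and the infection count recovered by maximum likelihood. The sketch below uses SciPy's nchypergeom_fisher; the odds ratio, death count, and coarse grid search are illustrative simplifications, not the paper's full estimator.

```python
# Sketch: estimate infections by maximum likelihood under a Fisher noncentral
# hypergeometric model of testing, then an illustrative IFR per 1,000.
# The odds ratio (5.0), death count, and grid step are assumptions.
import numpy as np
from scipy.stats import nchypergeom_fisher

def estimate_infections(population, tests, positives, odds=5.0, step=100):
    """Grid-search MLE for the number of infections given positive tests."""
    upper = population - (tests - positives)  # enough uninfected to yield the negatives
    candidates = np.arange(positives, upper + 1, step)
    loglik = nchypergeom_fisher.logpmf(positives, population, candidates, tests, odds)
    return int(candidates[np.argmax(loglik)])

infections = estimate_infections(population=20_000, tests=2_000, positives=300)
ifr_per_1000 = 1000 * 12 / infections  # assuming 12 deaths, purely illustrative
print(infections, round(ifr_per_1000, 2))
```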

Net fluxes of broadband shortwave and photosynthetically active radiation complement NDVI and near infrared reflectance of vegetation to explain gross photosynthesis variability across ecosystems and climate

(2024)

A significant challenge in global change research is understanding how vegetation interacts with the environment to influence ecosystem gross primary productivity (GPP) through carbon assimilation. One emerging objective is to consistently predict GPP fluctuations worldwide by establishing a robust scaling relationship between GPP measured at flux towers and satellite spectral reflectance data. A major hurdle in achieving this goal, however, is the discrepancy in spatial resolution between early satellite measurements and eddy flux measurements. Using a large set of growing-season data covering 100 site-years in North and Central America, we explored the potential of transforming incident and reflected shortwave (Rg) and photosynthetically active radiation (PAR) measurements into a broadband normalized difference vegetation index (NDVI) and near-infrared reflectance of vegetation (NIRv) that simultaneously explain GPP variability. We found that the broadband NDVI and NIRv derived from daily Rg and PAR measurements were highly correlated with Planet Fusion, Landsat-8/9, and Sentinel-2 narrowband NDVI and NIRv across a wide range of climate and ecological gradients. The differences between satellite and broadband NDVI and NIRv were significantly associated with soil background variations, phenological stages, water stress, and signal saturation of broadband NIR reflectance at high biomass. The seasonal variability of broadband NDVI and NIRv remarkably captured the seasonality of vegetation phenology, evaporative fraction, GPP, and rainfall in different ecosystems. Although saturation of GPP at high NDVI was evident, a linear relationship between broadband NIRv multiplied by incident PAR and GPP indicated the effectiveness of the NIRv-based approach in capturing the hidden light-use-efficiency impacts on GPP. Our study concludes that inexpensive measurements of Rg and PAR components can provide reliable, continuous information on NDVI, NIRv, and GPP. This enhances the sensing capability of flux tower sites without requiring additional spectrometer measurements. The proposed in-situ vegetation indices make a compelling case for using radiation signals to connect ecosystem-scale measurements with remote sensing observables relevant to carbon uptake.
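A minimal sketch of the broadband index computation: the NIR band is approximated as shortwave minus PAR from the paired up- and down-facing radiometers available at flux towers, and broadband NDVI and NIRv are then formed from the two band reflectances. This partitioning is a common approximation rather than the paper's exact procedure, and the example numbers are illustrative (PAR is taken in energy units here).

```python
# Broadband NDVI and NIRv from incident/reflected shortwave (Rg) and PAR.
# Assumes the NIR band can be approximated as Rg minus PAR (W/m^2).
def broadband_indices(rg_in, rg_out, par_in, par_out):
    """Broadband NDVI and NIRv from tower radiation components."""
    rho_par = par_out / par_in                       # visible (PAR) reflectance
    rho_nir = (rg_out - par_out) / (rg_in - par_in)  # NIR-band reflectance
    ndvi = (rho_nir - rho_par) / (rho_nir + rho_par)
    nirv = ndvi * rho_nir                            # NIR reflectance of vegetation
    return ndvi, nirv

# Daily means from a hypothetical tower (all values in W/m^2, illustrative).
ndvi, nirv = broadband_indices(rg_in=500.0, rg_out=90.0, par_in=220.0, par_out=12.0)
print(f"broadband NDVI = {ndvi:.2f}, NIRv = {nirv:.2f}")
```

Multiplying the resulting NIRv by incident PAR gives the quantity the abstract reports as linearly related to GPP.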