An interactive experimental and numerical data generation and visualization tool for geotechnical earthquake engineering applications
**********
Katerina Ziotopoulou
Civil & Environmental Engineering / UC Davis
Project details
One of the most constant pursuits of geotechnical earthquake engineers is predicting the response of soil masses and any associated infrastructure to earthquake loading. Soil deposits are highly spatially variable, expensive to test in situ, and challenging to sample. At the same time, earthquakes are unpredictable in nature and impose high demands on our geosystems. The approach we follow is to build experience, as well as robust numerical tools, based on experimental and case history evidence and on theoretical/analytical understanding. This means constantly testing reduced-scale specimens in laboratories under cyclic loads that resemble earthquake demands and, in parallel, building numerical models that are continuously implemented and calibrated against laboratory data as well as case histories. Although our discipline has made progress in properly curating and disseminating datasets (experimental or numerical), there is no tool that can inform its users about existing datasets (material, type of test, properties, responses obtained, etc.), their quality, or any simulation parameters that have previously been fit to them, so that the effort is not replicated. Our lab works on these various components and is seeking a tool that will optimize the way we approach experimental data curation, processing, and visualization, and that will connect the experimental datasets to numerical simulation tools so that the overall process becomes more efficient.
Students working on this project will assist in the design and implementation of an interactive tool capable of acting as a pre-processing, post-processing, storage, and visualization unit. This tool will be used by graduate student researchers at UC Davis as well as by researchers and practitioners nationally and internationally. Ideally, the tool would be an extensible application so that we can later implement more types of geotechnical tests and simulations, along with their particular datasets (a possible structure is sketched below). The tool should visualize the data in a flexible and interactive format, and we expect that it will be tested and improved with our feedback. We already perform some of the desired functions via MATLAB codes, but they are inefficient, not integrated, and not sustainable. The tool is expected to be integrated into the geotechnical labs at UC Davis and to be distributed via the DesignSafe (https://www.designsafe-ci.org/) platform. The sketch in the attachment illustrates an idealization of what we are seeking.
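To make the extensibility concrete, the following is a minimal architectural sketch in Python; every name in it (the plugin fields and registry functions) is a hypothetical illustration rather than a prescribed design. The idea is that each geotechnical test or simulation type registers a small plugin describing how to parse its raw files and which response plots it supports, so new types can be added later without changing the core application.

```python
# Hypothetical sketch of an extensible "test type" registry (all names are illustrative).
from dataclasses import dataclass, field
from typing import Callable, Dict

import pandas as pd


@dataclass
class TestTypePlugin:
    name: str                                     # e.g. "cyclic_direct_simple_shear"
    parse: Callable[[str], pd.DataFrame]          # raw data file path -> tidy DataFrame
    plots: Dict[str, Callable[[pd.DataFrame], None]] = field(default_factory=dict)  # plot name -> plotting routine


_REGISTRY: Dict[str, TestTypePlugin] = {}


def register_test_type(plugin: TestTypePlugin) -> None:
    """Expose a new test type to the pre-/post-processing and plotting interface."""
    _REGISTRY[plugin.name] = plugin


def get_test_type(name: str) -> TestTypePlugin:
    """Look up a previously registered test type by name."""
    return _REGISTRY[name]
```

A registry of this kind would let the interface populate its menus of test types and available plots directly from whatever plugins are installed, which is one way to keep the tool open to additional tests and simulation types later on.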
We envision a working prototype of the tool, along with a user manual and the tool's source code on GitHub. The resulting program could be stand-alone (on Mac and/or PC) or could be a web (HTML) application. Specifics will include at least:
1. Ability to input raw data files and user-defined experimental parameters from our experimental device, process them, and export plots of different responses (see the sketch after this list).
2. Ability to store and index new data through an interactive interface.
3. Ability to pre-process files for single-element numerical simulations, invoke an external software package (available) to batch run them, and subsequently compare the results to one or more target experimental sets (from 1).
4. Ability to store calibration parameters and files from 3 and inform future users of past work to avoid duplicating effort.
5. Ability to display calculated parameters on an easily readable "dashboard" output screen.
We would also welcome the opportunity to write a paper and include the team as authors in order to disseminate this tool broadly to the geotechnical community in research and practice.
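As an illustration of item 1, here is a minimal Python sketch of a processing routine. The column names ("shear_stress_kPa", "shear_strain_pct", "vertical_eff_stress_kPa") and the cyclic-stress-ratio calculation are assumptions for a cyclic direct simple shear test; the actual device format, units, and derived responses would be specified by the lab.

```python
# Minimal sketch for item 1 above: read one raw test file, derive a response,
# and export plots. Column names and units are assumed, not the device's format.
import pandas as pd
import matplotlib.pyplot as plt


def process_cyclic_test(raw_csv: str, sigma_vc0_kPa: float, out_prefix: str) -> pd.DataFrame:
    """Read a raw CSV, add a cyclic stress ratio column, and export two response plots."""
    df = pd.read_csv(raw_csv)

    # Cyclic stress ratio relative to the user-supplied initial vertical effective stress.
    df["CSR"] = df["shear_stress_kPa"] / sigma_vc0_kPa

    # Stress-strain loops.
    fig, ax = plt.subplots()
    ax.plot(df["shear_strain_pct"], df["shear_stress_kPa"], lw=0.8)
    ax.set_xlabel("Shear strain (%)")
    ax.set_ylabel("Shear stress (kPa)")
    fig.savefig(f"{out_prefix}_stress_strain.png", dpi=200)
    plt.close(fig)

    # Effective stress path.
    fig, ax = plt.subplots()
    ax.plot(df["vertical_eff_stress_kPa"], df["shear_stress_kPa"], lw=0.8)
    ax.set_xlabel("Vertical effective stress (kPa)")
    ax.set_ylabel("Shear stress (kPa)")
    fig.savefig(f"{out_prefix}_stress_path.png", dpi=200)
    plt.close(fig)

    return df
```

For example, process_cyclic_test("DSS_test_001.csv", sigma_vc0_kPa=100.0, out_prefix="DSS_test_001") (a hypothetical file) would return the processed table for storage and indexing (item 2) and write the two plots; a similar routine driven by the output of the external simulation software would feed the comparisons described in item 3.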
Experience with Python, MATLAB, or a similar system for the input and analysis of continuously measured data. The ideal team will have experience with database APIs (retrieving data), pre-processing data, and data visualization tools. Signal processing skills and knowledge of machine learning algorithms are also desirable but not required.
**********
30-60 min weekly or more
Open source project
Attachment
No
Team members | N/A