Bonanza Creek SiteBytes 2008

Bonanza Creek LTER

Site Update (Year in Review):

The last year has seen a lot of activity as we worked to integrate a new data manager into our management team. Jason Downing was hired in September of 2007 as our site information manager and has been working hard to learn the current system and protocols while planning and implementing improvement projects for the data management system.

Over the past months we have deployed a new system for data submission into the BNZ database. The old system relied on personal or e-mail communication with the data manager to convey metadata requirements and track progress through the submission process; submissions were easily misplaced or abandoned, and the process fell apart with personnel changes. Building the new system first involved creating a metadata submission spreadsheet with documentation and examples. Next we developed an on-line submission interface that links with our database to create relational records of each upload action for future processing and tracking. We also developed and provided training for researchers, staff, and students on the basics of metadata, the BNZ metadata form, and how to use the data submission system. Feedback from this training was positive, and it will become an annual workshop offered to our personnel.
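As a rough illustration of the tracking approach, the sketch below logs one relational record per upload action so a submission can be followed from receipt to publication. It is a minimal Python/SQLite sketch; the table layout, field names, and status values are assumptions for illustration, not the production system.

    import sqlite3
    from datetime import datetime

    # Hypothetical schema: one relational record per upload action,
    # so each submission can be tracked through later processing.
    conn = sqlite3.connect("bnz_submissions.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS upload_actions (
            action_id   INTEGER PRIMARY KEY AUTOINCREMENT,
            submitter   TEXT NOT NULL,          -- researcher submitting the file
            filename    TEXT NOT NULL,          -- uploaded metadata form or data file
            uploaded_at TEXT NOT NULL,          -- timestamp of the upload action
            status      TEXT DEFAULT 'received' -- e.g. received, in review, published
        )""")

    def record_upload(submitter: str, filename: str) -> int:
        """Log an upload action and return its tracking ID."""
        cur = conn.execute(
            "INSERT INTO upload_actions (submitter, filename, uploaded_at) "
            "VALUES (?, ?, ?)",
            (submitter, filename, datetime.now().isoformat()),
        )
        conn.commit()
        return cur.lastrowid

    # Example: a researcher uploads a completed metadata form.
    tracking_id = record_upload("jdoe", "soil_temp_metadata.xls")
    print(f"Submission logged with tracking ID {tracking_id}")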

To become more familiar with each of the senior investigators, their research, and their data files in our system, the data manager and a supervising Co-PI for the site met individually with each senior researcher. These meetings were a forum to discuss the status of their data in the database, share information and resources, and foster improved working relationships between the scientists and the data management staff. In follow-up to these meetings, the data manager has begun making field and lab visits with each of the scientists in their home 'element'. These visits are proving very beneficial in building working relationships and interactions among staff.

The major hardware upgrade was replacing our outdated web application server with a new multifunctional server that provides virtual platforms for various operations. This new server currently hosts a virtual web site server (Linux) and an ArcGIS server (Windows). The virtual platform allows for easy development and upgrading of services. The next addition will be a dedicated SAS interface for quality control of our streaming climate data input.

Currently our database holds data, metadata, and generated EML (Levels 3-5) for 245 distinct data files. The EML is generated with a Perl script developed by Inigo San Gil from the LNO to efficiently produce versioned metadata for harvest. Through increased IM outreach we are working to boost the availability of electronic versions of published materials and to establish links between the listed publications and the actual data files currently in our system.
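The generator itself is the Perl script noted above; the Python sketch below only illustrates the general pattern of embedding a revision number in the EML packageId so that each regeneration produces a new version for the harvester to pick up. The scope name, element layout, and example values are assumptions for illustration.

    import xml.etree.ElementTree as ET

    def build_eml(scope: str, dataset_id: int, version: int,
                  title: str, creator: str) -> str:
        """Build a minimal, versioned EML document for one data file.

        The revision number is embedded in the packageId so each
        regeneration yields a new version for harvest.
        """
        eml = ET.Element("eml:eml", {
            "xmlns:eml": "eml://ecoinformatics.org/eml-2.0.1",
            "packageId": f"{scope}.{dataset_id}.{version}",
            "system": "knb",
        })
        dataset = ET.SubElement(eml, "dataset")
        ET.SubElement(dataset, "title").text = title
        person = ET.SubElement(ET.SubElement(dataset, "creator"), "individualName")
        ET.SubElement(person, "surName").text = creator
        return ET.tostring(eml, encoding="unicode")

    # Example: regenerate metadata for a hypothetical dataset at revision 3.
    print(build_eml("knb-lter-bnz", 167, 3,
                    "Hourly air temperature, example station", "Doe"))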

Site Plans (Year to Come):

Our current website is what was salvaged from the previous server, and it has many legacy issues that make it ripe for an upgrade; we would like that work to be part of the server upgrade process. The virtual platform will allow simultaneous development and production on the same server with limited changeover complications. Our big step will be to end our use of ColdFusion as the web-database linkage in favor of a simpler and more universal language such as PHP. The new web interface will give the site a visual and structural facelift and will hasten the development of more useful interactive interfaces for generating files and graphics from the database.

The ArcGIS server is only in the beginning stages of development, but it will replace our current ArcIMS web service, which will no longer be supported. The spatial data server will become an increasingly useful and critical component of our site and data management activities.

Additional sites now have radio communications and can be added to the streaming weather data system. Alongside this expansion, we want to route all of the streaming data through a SAS process that will run basic filtering and produce graphics for the technician staff to view and inspect for sensor equipment issues, as sketched below.
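The planned implementation is in SAS; the Python sketch below is only a minimal illustration of the kind of range-check filtering and flagging intended, both here and for the real-time QA/QC need described in the next section. The variable names and limits are assumptions.

    # Illustrative range limits for incoming logger variables.
    RANGE_LIMITS = {
        "air_temp_c": (-60.0, 40.0),   # plausible interior Alaska extremes
        "rel_humidity": (0.0, 100.0),
        "wind_speed_ms": (0.0, 75.0),
    }

    def flag_record(record: dict) -> dict:
        """Return QA flags for one logger record: 'ok', 'range', or 'missing'."""
        flags = {}
        for var, (lo, hi) in RANGE_LIMITS.items():
            value = record.get(var)
            if value is None:
                flags[var] = "missing"    # sensor dropout or transmission gap
            elif not (lo <= value <= hi):
                flags[var] = "range"      # likely sensor or datalogger issue
            else:
                flags[var] = "ok"
        return flags

    # Example: a suspect humidity value gets flagged for technician review.
    print(flag_record({"air_temp_c": -12.4, "rel_humidity": 135.0,
                       "wind_speed_ms": 3.1}))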

CI Need at BNZ:

The major CI needs for BNZ are software and staffing to implement real-time QA/QC filtering and flagging of streaming data as it enters our database. As more loggers send data straight into the database, it has become possible to lose track of the quality of the incoming data or to be unaware of serious sensor issues. The volume of data and the wide range of possible problems also make playing catch-up after the fact extremely difficult. These core datasets are supposed to be the foundation that supports our other scientific research, but as we try to use them we are finding quality problems that require substantial attention before the data can be utilized. These issues place an undue burden on field and data staff as they try to make the data available and usable, and this deficiency limits our ability to contribute basic data to our site scientists as well as to network-level collaborations.
