Welcome to the RMWQAA Website! 

Newsletter

RMWQAA is now publishing new content monthly and discontinuing the quarterly newsletter. Check back to the Newsletter page regularly for continually changing info, articles, news, and more. Archived newsletters will still be available to members.

June 20, 2017

 

The Argo Tunnel (Figure 1) was the primary drainage and ore transport tunnel from Nevadaville to Idaho Springs. It was excavated between 1893 and 1910, drained water from several mine workings, and allowed ore carts to be wheeled right up to the Argo Mill next door. Although the tunnel has not been used to transport ore since the 1940s, water still drains through it constantly. The tunnel discharge averages 275 gallons of acidic contaminated water per minute, releasing approximately 850 pounds of dissolved metals each day. The Argo Tunnel Water Treatment Plant began operating in April 1998, treating water from the Argo Tunnel. Flows from the Big Five Tunnel at the west end of Idaho Springs and groundwater from Virginia Canyon were added in 2006 (Figure 2).

 Figure 1: Entrance to the Argo Tunnel                          Figure 2: Discharge from Virginia Canyon and Big Five Tunnel

 

The Argo Tunnel Flow Control Bulkhead (Figure 3) was completed in August 2015 at a cost of approximately $970,000. A pipe runs through the concrete plug so water treatment plant operators can regulate the flow and control water levels inside the mine pool. The tunnel has a history of surge events that released untreated mine water into Clear Creek. The primary contaminants include acidity and a host of heavy metals, including aluminum, copper, iron, manganese, and zinc.

 

Figure 3: Bulkhead                                                            Figure 4: RMWQAA group inside the tunnel

 

After flowing down from the tunnel, the influent accumulates in an equilibration basin (Figure 5) before entering the plant. From there it is mixed with recycled metal hydroxides and hydrated lime until a pH of 9.9 S.U. is achieved. The lime system includes a silo for storage (Figure 6), a slurry mix tank in the enclosure along with the silo, piping from the lime enclosure to the WTP, a day tank within the WTP and diaphragm metering pumps for feeding the lime into the treatment process. Recent upgrades converted the plant’s conventional process to a high-density sludge (HDS) process. The HDS process sends metal hydroxides into a conditioning tank where they are coated with lime and sent back through the system for up to 30 additional treatment cycles. The process is more efficient at removing metals from the water, resulting in denser filter cake and less material sent to landfills.

Figure 5: Equilibration basin                                                                     Figure 6: Lime storage silo

 

This mixture is sent to a sludge thickener, where the precipitates settle out of the water by gravity and the clarified water flows off the top. A polymer (Figure 7) is added at a low dose to improve settling and filtration performance. The overflow water is polished using a sand filter, then treated with hydrochloric acid to achieve a discharge pH of approximately 8.5 S.U. (Figure 8). The precipitates are pumped from the bottom of the sludge thickener and sent to a plate-and-frame filter press. The filter cake contains approximately 35-40% solids and passes the Toxicity Characteristic Leaching Procedure (TCLP), classifying it as a nonhazardous waste that is disposed of in a municipal landfill.

Figure 7:  Mary Boardman showing polymer system    Figure 8: Outfall

 

Special thanks to Mary Boardman with the Colorado Department of Public Health and the Environment for giving the RMWQAA group the tour!

 

Lindie Aragon is the Chemist at the City of Westminster's Wastewater Lab. She is the head of the RMWQAA scholarship committee and coordinated the Argo tour.  

May 6, 2017 

 

Chlorophyll a. Standard Methods 10200H. EPA 445-447. Straightforward, right? Follow the method, and get the concentration of chlorophyll a in the sample. It seems that way, but I would argue, and many other scientists would agree, that it is not straightforward at all.

 

Chlorophyll a is a surrogate measurement for algal biomass at the community level, and is often used to evaluate the effects of nitrogen (N) and phosphorus (P) additions to a waterbody. Algae incorporate carbon (C), N, P, and other elements, leading to growth, proliferation, and more chlorophyll contained within the waterbody. While this process is fairly definitive, the composition of chlorophyll is not.

 

Image source: http://www.urbancultivator.net/drinking-chlorophyll

 

The photosynthetic pigments in freshwater algal cells include the chlorophylls (i.e., a, b, and c) and the carotenoids (i.e., carotenes, fucoxanthin, and xanthophylls), and depending on the type of algae (e.g., cyanophyta and pyrrophyta), can also include phycobiliproteins or bacteriochlorophylls (Kirk 1994, NALMS 2017). Thus, when a lab is asked to measure chlorophyll a in a water sample, how confident can it be that the reported concentration is only chlorophyll a, and not all these other components?

 

The basic chlorophyll a methods listed above try to account for the numerous other colorful components of algae (pigments) that could be recognized by the spectrophotometer or fluorometer and confound the chlorophyll a concentration. In addition, photosynthetic pigments begin to degrade soon after collection, so knowing the quantity of active versus degraded pigments is important. Acidification can account for the degradation of chlorophyll to pheophytin, but how can a lab be certain that the chlorophyll a concentration measured with one method is equivalent to the concentration measured with a different method?
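The acidification step mentioned above is the basis of the Lorenzen-style monochromatic calculation in Standard Methods 10200H: absorbance is read at 664 nm before acidification and 665 nm after (both corrected against 750 nm), and the difference separates active chlorophyll a from pheophytin. A rough sketch, with illustrative function names, absorbances, and volumes (the method itself should be consulted for the exact procedure):

```python
# Hedged sketch of the Lorenzen (monochromatic) pheophytin correction.
# Absorbances are assumed already corrected against the 750 nm reading.

def lorenzen(abs664_before, abs665_after, v_extract_l, v_sample_m3, cell_cm):
    """Return (chlorophyll a, pheophytin a) in mg/m^3 (equivalent to ug/L)."""
    factor = 26.7 * v_extract_l / (v_sample_m3 * cell_cm)
    chl_a = factor * (abs664_before - abs665_after)          # active pigment
    pheo_a = factor * (1.7 * abs665_after - abs664_before)   # degraded pigment
    return chl_a, pheo_a

# Example: 0.5 L sample (5e-4 m^3), 10 mL extract (0.010 L), 1 cm cell
chl, pheo = lorenzen(0.120, 0.075, 0.010, 5e-4, 1.0)
```

A sample with no degradation gives an acid ratio (664 before / 665 after) near 1.7, driving the pheophytin term toward zero; heavily degraded samples push the ratio toward 1.0.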

 

The logical answer is a chlorophyll a standard. Chlorophyll a standards are available from a number of sources, but their actual concentration “as chlorophyll a” is not always clear. The standards are typically created through pulverization of spinach or other chlorophyll-rich leaves and can be in the form of a solid or liquid. Detailed instructions on how to use these standards are rarely provided by manufacturers. Often, these standards do not correspond to a specific concentration of chlorophyll a, only a concentration of ground spinach leaves. Numerous labs can measure the same “standard”, but who is to say what the actual concentration of chlorophyll a is?

 

These uncertainties in quality assurance provided the impetus for the Chlorophyll a Round Robin Event conducted by the City of Northglenn in 2015 (Taylor et al. 2015). The RMWQAA’s concern with this methodology is not unique; the State of Florida reiterates the same concerns regarding the numerous available methods and the variety of ways the data can be handled in “Applicability of Chlorophyll a Methods” (FLDEP 2011). A total of 11 analytical laboratories participated in the event, each reporting its chlorophyll a result for a laboratory split sample. The “corrected” (i.e., pheophytin correction taken into account) chlorophyll a values ranged from 77.6 to 162 mg/L, while labs reporting uncorrected chlorophyll a had a much larger range, from 10.8 to 178.9 mg/L. These results highlight the uncertainty in laboratory and methodological approaches and confound the true answer.

 

While the true concentration remained elusive, the real concern comes into play when regulatory agencies make decisions based on these data. Total Maximum Daily Loads (TMDLs) for nutrients are being determined from chlorophyll a data, resulting in more stringent nutrient limits for dischargers to these waterbodies. Thus, the question arises: Are regulatory decisions based on data from comparable chlorophyll a methods, or are different methods being used, potentially skewing the results? Furthermore, are laboratories confident in their results for chlorophyll a, and do data users understand the uncertainty associated with the results? All of these questions should be considered before implementing regulatory decisions, such as a TMDL, that could dramatically impact limits for dischargers.

 

References

Florida Department of Environmental Protection (FLDEP). 2011. Applicability of Chlorophyll a Methods. DEP-SAS-002/10. October 24, 2011.

 

Kirk, T.O., 1994. Light & Photosynthesis in Aquatic Ecosystems, Second Edition. Cambridge University Press, New York, NY, 509 pages.

 

North American Lake Management Society (NALMS). 2017. Chlorophyll Analysis. The Secchi Dip-In. From: http://www.secchidipin.org/index.php/monitoring-methods/chlorophyll-analysis/ accessed 5/1/17.

 

Taylor, E., Guilmette, C., Rhodes, E. 2015. Chlorophyll-A Round Robin. City of Northglenn Water Quality Laboratory.

 

Natalie Love is the Laboratory Director for GEI Consultants, Inc's Laboratory. GEI conducts Whole Effluent Toxicity (WET) testing, benthic macroinvertebrate identifications, and low level nutrient analysis for Regulation 85 and other local monitoring.

April 4, 2017  

 

Interpreting Discrepancies between Analysis Methods, Machines, and More

               As advancements in water and wastewater treatment technologies continue to emerge, and regulations present municipalities with more hurdles in meeting increasingly stringent standards, we also see greater potential for misinterpreting data and how it affects these complex treatment processes. This idea has not escaped us at our Water Pollution Control Facility in Greeley.

               One particular instance arises in the measurement of volatile acids. For the last few years our lab has relied on a single method of measuring volatile acids, as well as certain equipment for measuring these results. This method has been a valuable indicator of the health of our anaerobic digester systems, especially in the winter, when we experience the most complications in the treatment process.

               Recently, with the addition of new treatment processes and new employees who see value in other methods, we have begun to examine alternatives to our original procedure. These alternatives include, but are not limited to, using a different method as well as a different instrument for analysis. Our findings were a bit unsettling and confusing.

               Although we have only recently begun this study and it is still in progress, we are already seeing relatively significant differences. Granted, it is no world-changing finding to say that different methods and different instruments can produce different results on duplicate samples. The point, however, is that certain parameters exhibit greater discrepancies than others, and those discrepancies can be difficult for operators and analysts alike to interpret and put to use.

               Our initial investigation began when we found an alternative method from the same company that provides the reagents for our original method. This new method appealed to us as a simpler, less time-consuming process. However, the spectrophotometer (spec) used to analyze the results was different from the spec used in our original method. Fortunately, the instrument used in the new method could be equipped with an adapter to run our original method, so we had the ability to analyze both methods on the same spec.

               Here’s where our troubles surfaced: we didn’t know which results to take as our final number when it came to adjusting the operation of different processes in the plant. We had been using the same method for so long and using it as our proverbial ‘canary in the coal-mine,’ that when it came to new methodologies we had no baseline or warning limits to go by. Granted, if our results were close enough between methods and could be considered ‘apples-to-apples’ we wouldn’t need to worry about this. Yet Murphy’s Law had something different to say about this, and we began to see results of the new method on the new spec at 20% and sometimes even 50% higher than the original method on the old spec. Furthermore, this nascent study has shown that the new method produces more accurate results with our stock standard when compared to the old. So, where does this leave us?
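Method-to-method gaps like the 20% to 50% described above are often quantified as relative percent difference (RPD), a common lab QC metric for comparing duplicate results. A minimal sketch with made-up volatile acids numbers:

```python
# Relative percent difference (RPD) between duplicate results; the values
# below are illustrative only, not actual plant data.

def rpd(a: float, b: float) -> float:
    """RPD between two duplicate results, as a percentage of their mean."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

old_method = 200.0   # e.g., volatile acids, mg/L, original method
new_method = 250.0   # a result 25% higher on the new method/spec
print(f"RPD: {rpd(old_method, new_method):.1f}%")
```

Tracking RPD over a run of split samples gives a defensible number for how far apart the two methods really are, rather than a gut feel.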

               It seems that logic would dictate that when it comes to making significant changes in operating procedures we should make our adjustments based on the original method with the support of historical trends to back the decision. This has been our stance as we fight through worrying conditions in a push to stabilize the system before a more predictable and manageable period during the spring and summer.

               Fortunately, we will be entering a transition period with the construction and installation of a new primary digester. With this new change and somewhat of a blank slate to work with, we feel justified in converting over to the newer method with its perceivably more accurate results. Suffice it to say, we will continue to run the original method in conjunction with the new method to further establish some semblance of trends to compare.

               In closing, I want to point out that this is but one small case study on a topic that any plant can be subject to when put under certain stresses. Whether it is new analysts coming into a lab, new equipment or methods being instituted by necessity or preference, or even changes to method rules as seen in March’s MDL write-up, variability in a lab and its methods can have considerable impacts on a treatment plant’s performance and the decisions made from its data.

 

Zach Dahlgren is a Water Quality Analyst at the Water Pollution Control Facility in Greeley. He has been with the City of Greeley going on 4 years and with the RMWQAA as Website Chair for over 3 years.

March 6, 2017

 

On December 15, 2016, Gina McCarthy (then United States Environmental Protection Agency Administrator) signed the final rule which included an update to 40 CFR 136 Appendix B, the determination of the method detection limit (MDL). While the rule has been signed, it is awaiting publication in the Federal Register and has therefore not officially been released for use. The current MDL procedure has been in place in 40 CFR 136 Appendix B since 1984 with little to no changes since.

 

Why does the MDL method need to change? The biggest concern with the current MDL procedure is that it does not address contamination in blank samples, inadvertently driving MDLs lower than they realistically should be. Additionally, the current method does not address variability within a lab, again driving the MDLs down.

 

By definition, the MDL is “the minimum measured concentration of a substance that can be reported with 99% confidence that the measured concentration is distinguishable from method blank results.” In the current method for determining MDLs, seven spiked samples are analyzed and the MDL is calculated by multiplying the appropriate Student’s t value by the standard deviation of those spiked samples. No guidelines are provided on the frequency of MDL calculation or the timeframe within which these data should be collected; therefore, variability in analytical measurements by different analysts and variability over time are not taken into account. This procedure may result in MDLs that are higher or lower than they would be if variability were taken into account.

 

The update attempts to control some of this variability by determining MDLs from a minimum of seven spiked samples measured in at least three batches on three separate calendar dates. The MDL of the spiked samples is calculated using the same equation as the current method. To ensure the MDL is distinguishable from the method blanks, the new method requires that a blank MDL be calculated by multiplying the appropriate Student’s t value by the standard deviation of the blank samples and then adding the mean of the method blank results. The higher of the spiked MDL and blank MDL becomes the new MDL for the method. The MDL is verified annually by analyzing at least two samples quarterly.
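The spiked-versus-blank comparison described above can be sketched as follows. All results are made up for illustration, and the t value shown assumes exactly seven replicates; the rule itself governs the actual procedure:

```python
# Hedged sketch of the updated MDL calculation (40 CFR 136 App. B revision):
# the reported MDL is the higher of the spike-based and blank-based MDLs.
from statistics import mean, stdev

T_99 = 3.143  # one-tailed Student's t at 99% confidence for n = 7 (df = 6)

spikes = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.51]   # spiked sample results
blanks = [0.02, 0.00, 0.03, 0.01, 0.02, 0.00, 0.04]   # method blank results

mdl_spike = T_99 * stdev(spikes)                      # same as current method
mdl_blank = mean(blanks) + T_99 * stdev(blanks)       # new blank-based MDL
mdl = max(mdl_spike, mdl_blank)                       # reported MDL
```

With clean blanks the spike-based value still controls; as blank contamination or blank variability grows, the blank MDL takes over and pushes the reported MDL up, which is exactly the behavior the revision is after.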

 

Overall, these changes are a significant improvement over the current procedure; however, some concerns remain. As analytical methods improve and impacts to aquatic life become more quantifiable, water quality criteria continue to decrease, along with the associated permit limits required to protect aquatic life. Meanwhile, by better accounting for low-level contamination through the use of blank MDLs, the new procedure is likely to drive the MDL up.

 

Higher MDLs can lead to a permitting conundrum. Either the reporting limit (RL) and associated permit limit must increase with the MDL, leaving it unclear whether aquatic life is being protected, or the RL and permit limit are adjusted down to align with the standard. By decreasing the difference between the MDL and RL, the confidence that the data are truly detected also decreases. Neither regulatory scenario is ideal, as it becomes unclear whether aquatic life is truly protected, or whether the quantifiable data are truly quantifiable.

 

The work required to reduce potential contamination that may lead to higher MDLs may be too much for smaller labs to handle, resulting in the need to outsource samples to other labs with better capacity to reduce low-level contamination. By not conducting tests in-house, dischargers could be faced with increased costs.

 

The option to retain an existing MDL as long as it is within 3 times the newly calculated MDL also reduces the legitimacy of the new method, as it allows MDLs to increase or decrease based on a decision rather than on the data.

 

Is your lab ready for the new MDL procedure?

 

 

Melissa Mimna is the Water Quality Analyst in the City of Boulder's Water Quality and Environmental Services Group. Natalie Love is the Laboratory Director for GEI Consultants, Inc's Laboratory. GEI conducts Whole Effluent Toxicity (WET) testing, benthic macroinvertebrate identifications, and low level nutrient analysis for Regulation 85 and other local monitoring. This excerpt was taken from a poster presented at the Society for Environmental Chemistry and Toxicology national conference in 2015.

February 16, 2017

 

On January 26, 2017, RMWQAA hosted a free tour of the Leprino Foods Wastewater Treatment Plant. Leprino Foods Company began operations in Greeley, Colorado in 2011, bringing much needed jobs and economic benefits to Weld County. Over 430 people are employed at the factory, which helps make Leprino Foods the largest producer of mozzarella cheese in the world. The entire production supply chain is localized with over 100 semi-truck loads of milk offloaded into the factory each day.

 

The raw milk delivered to the plant is almost 87% water. Much of this water is removed during evaporation and condensed into condensate of whey (COW) water. By capturing this COW water, the plant can rely less on municipal water sources and return around 300,000 gallons more water per day to the Cache la Poudre River than it takes in.

 

During the tour, we witnessed some of the by-products of the wastewater treatment process. Dried biosolids, pressed into a cake-like consistency, were slowly loaded into large dump trucks for transport to local farms, where they are used as a soil amendment. Construction on a new digester was underway to allow the plant to utilize methane gas to provide energy to the plant.

 

The tour was a great chance to see different aspects of the wastewater treatment process, including the testing laboratory, and to get a look inside a large and important contributor to our local economy.

 

Danny McCausland attended the tour and is the Water Quality Supervisor for GEI's Laboratory. GEI conducts Whole Effluent Toxicity (WET) testing, benthic macroinvertebrate identifications, and low level nutrient analysis for Regulation 85 and other local monitoring.

January 26, 2017 

 

December Holiday Social Recap 

 

On December 1, the Lab Practices Committee of the RMWEA and the Rocky Mountain Water Quality Analyst Association joined forces to host a holiday social event for their members.  The event was held at the Lazy Dog in Westminster and there was a fantastic turnout, with about 40 water quality enthusiasts from local laboratories joining in the fun.  There were a number of games to keep everyone entertained, including Water Quality Family Feud, Guess the number of stoppers, and Close-up Lab Pic Trivia.  It was a good chance for networking, sharing lab stories, discussing future lab issues, and returning borrowed reagents.  RMWQAA holds social events every July and December and they are a great way to collaborate with peers… don’t miss the next one!

 

 

December 3, 2016
 

In early November, the RMWQAA education subcommittee hosted a workshop to give analysts a more solid background on the drinking water and wastewater regulations that drive their daily work. The workshop was led by Hope Dalton, who has a strong background in regulations, education, and laboratory work. Ms. Dalton discussed the history of federal regulations as well as the Colorado-specific regulations and how they are implemented. Natalie Love provided information on the Clean Water Act and Colorado wastewater regulations. There was a strong turnout from across the Front Range, as well as attendees from as far away as Eagle County.

 

The regulatory workshop played a secondary role in piloting a new look for RMWQAA presentations. The goal is to provide more consistency in the training delivered by the organization. The education subcommittee is working towards providing future trainings via the internet so we can reach members in less accessible areas and those who cannot travel often. Stay tuned for future trainings.

 

 

November 15, 2016

The Importance of In-Line Instrumentation 

Approximately six years ago, Metro Wastewater Reclamation District determined that it was imperative to have a dedicated group of individuals managing all of the in-line instruments that were engineered into several of our plant upgrade projects, especially in the secondary treatment portion of the plant.  A majority of the instrumentation is used for determining the effectiveness of biological nutrient removal, including nitrification/denitrification and phosphorus uptake.  By that time, the reliability and accuracy of in-line instruments had risen to a point where they gave our Treatment staff the comfort level to make real-time process control decisions based on the information received from the instruments.

 

Traditionally, in-line instruments were used only for dissolved oxygen, pH, and total suspended solids.  Now we are seeing tremendous success measuring parameters such as ammonia, nitrite, nitrate/nitrite, orthophosphate, and chlorine, with some instruments able to measure multiple parameters in sequence.  We have also had some success with in-line instruments that measure total organic carbon reliably.  The ammonia instruments are also tied into a control logic system for aeration to improve nitrification/denitrification efficiencies while reducing the costs of aeration.

 

Although most in-line instruments tend to be less costly than their more robust benchtop counterparts, an in-line budget is not for the “faint of heart” once you include the original price of the instrument, replacement parts, and a dedicated staff to maintain them.  Our in-line group is “housed” in the Analytical Services Division based on our expertise with analytical instrumentation, but it is supported by many other divisions at the facility, as all instruments are tied into our control systems.  The In-Line Instrumentation Support group has grown to five Specialists covering our North, South, and Northern Treatment plants.

 

Kevin Feeley, B.S. Biology, M.B.A, is the Chief of Analytical Services and has been employed with Metro Wastewater Reclamation District for 27 years. Mr. Feeley is the former Chair of the RMWEA Lab Practices Committee, a RMWQAA board participant, and on the Red Rocks Water Quality Program Advisory Board. Outside of the water and wastewater world, Kevin holds a 2nd degree black belt in Tae Kwon Do, and is the owner of 25,000+ Hot Wheels cars.

September 8, 2016

Lead and Copper Year

In 2016 the summer Olympics were held in Rio. On the Chinese calendar, 2016 is the year of the monkey. In the United States, 2016 is a presidential election year. Here in Aurora, it is the year of lead and copper.

 

Here in the drinking water world, some of our regulations differ from those in the wastewater world. We are required to monitor the distribution system for lead and copper levels in the water. The frequency of this monitoring varies with each utility based on the monitoring plan established by the governing body, in our case the Colorado Department of Public Health and the Environment.

 

The sampling is regulated by the Lead and Copper Rule. A common source of lead and copper is the pipes in the home. Some older homes have lead service lines or copper pipes with lead solder. Water, being an excellent solvent, can leach these metals from pipes. A number of water quality parameters can be controlled to prevent leaching of metals from the pipes into the water.  One tool that can be used is the Langelier Index. This index takes into consideration water quality factors, such as pH, alkalinity, and hardness, to determine the likelihood of the water corroding the pipes and leaching metals into the water.

 

This year Aurora Water is conducting a lead and copper study. After what happened in Michigan, this is a hot topic, and water customers are significantly more aware of the quality of their water. This awareness motivates more customers to participate in the study.

 

In previous years it was a challenge to obtain the 50 samples without repeated customer contacts. This year is different: the response has been great.  By our third week, we had received almost all of the required samples.

 

The study involves a few steps:

  • Notify customers – We send a letter notifying the customers on the pool list that we are conducting a lead and copper study and request their participation.  The majority of the samples are collected in July but we are flexible with the customers as long as the collection is within our study window.
  • Deliver bottles and instructions – This is time consuming and labor intensive.  A bottle is delivered to each home and placed in a secure but visible place.  The instructions for filling the bottle are included on a chain of custody form that the sampler must sign and write in the date and time the sample was taken.
  • Sampling – We are required to maintain a pool of at least 100 homes and we have to receive a minimum of 50 samples back for analysis. The sample has to be a first draw sample, after the water has been sitting in the pipes for at least 6 hours. This is the reason the customers take their own samples. This year many of our regular participants have moved so we have reached out to the new customers to make sure they understand the process so that the samples are taken correctly. 
  • Collect bottles – Once the sample is collected the customer will call and let us know. We will have any employees in the area pick-up the sample and bring it back to the lab. 
  • Analyze samples – The samples will be analyzed for turbidity, lead, and copper. The turbidity is measured when the sample is received in the lab. If the turbidity is greater than 1 NTU, the sample has to be digested prior to analysis. The sample is preserved with nitric acid and held at room temperature until it is analyzed.
  • Notify customers of results – Once the results are peer reviewed and QC checked, the customers are notified of the results for the water tested at their home.
  • Report results to the state – Once all of the samples are analyzed, the results are compiled and reported to the state health department.

The action level for lead is 0.015 mg/L and for copper is 1.3 mg/L. Results at this level are not an automatic violation. The results are ranked in order from smallest to largest. If the 90th percentile is above the action level, that is referred to as an action level exceedance. Exceedances will result in increased monitoring frequencies. Corrosion studies may also be mandated.
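The ranking step described above can be sketched as follows. The data are made up, and conventions for non-integer ranks vary under the rule, so treat this as a simplified illustration rather than the compliance calculation itself:

```python
# Sketch of the action-level check: sort the first-draw results, take the
# value at rank 0.9 * n (1-based, ascending), compare to the action level.

LEAD_ACTION_LEVEL = 0.015  # mg/L

def percentile_90(results):
    ordered = sorted(results)
    rank = int(round(0.9 * len(ordered)))  # 1-based rank of the 90th pct
    return ordered[rank - 1]

# Made-up first-draw lead results (mg/L) from 10 homes
lead = [0.001, 0.002, 0.002, 0.003, 0.004, 0.005, 0.007, 0.009, 0.012, 0.020]
p90 = percentile_90(lead)            # 9th of 10 ordered values
exceeds = p90 > LEAD_ACTION_LEVEL    # one high home does not trigger action
```

Note that the single 0.020 mg/L result above does not cause an exceedance; only the 90th percentile crossing the action level does.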

 

Customers will be anticipating the results of the analysis of the water in their homes. As our study concludes this month, we are hopeful that 2016 will not turn into the year of routine monitoring for lead and copper.

 

Adele Rucker is the metals analyst for Aurora's Water Quality Control Laboratory.

August 22, 2016

Laboratory Information Management Systems (LIMS) Purchasing Process

 

Purchasing a laboratory information management system can be a daunting and overwhelming task. You start asking yourself these questions:

 

       1.    Where do I start?

       2.    What do I really need?

       3.    How much will it cost?

       4.    Who will help set it up?

       5.    What features do I need/want?

       6.    Will it really be everything that I want it to be?

 

Don’t let these questions scare you away from the purchase. Our laboratory purchased our first LIMS in 2001. Prior to that, we entered all of our data in Excel spreadsheets. Excel spreadsheets work great until your boss starts coming to you with requests for data from 5 years ago, wanting monthly averages, maximums, and minimums, or a comparison of the total suspended solids from January 2010 to January 2016. You can retrieve that data from Excel spreadsheets, but it may take hours or days to locate and calculate it. A LIMS is a powerful data warehouse that can hold years of data and make it available with just a few keystrokes. Our first LIMS worked great for many years, but in 2014 we decided that we needed a more powerful system that both our environmental and wastewater laboratories could purchase and use together.
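The kind of query described above (monthly averages, maximums, and minimums across years of results) is exactly what a LIMS or database makes easy and scattered spreadsheets make painful. A minimal sketch with made-up TSS data:

```python
# Grouping years of results by month and summarizing them; the dates and
# TSS values below are invented for illustration.
from collections import defaultdict
from statistics import mean

# (date string, TSS mg/L) pairs, as they might come from exported sheets
results = [("2010-01-05", 12.0), ("2010-01-19", 18.0),
           ("2016-01-04", 9.0), ("2016-01-20", 11.0)]

by_month = defaultdict(list)
for date, tss in results:
    by_month[date[:7]].append(tss)   # key on "YYYY-MM"

# month -> (average, maximum, minimum)
summary = {m: (mean(v), max(v), min(v)) for m, v in by_month.items()}
```

With the data already in one queryable store, the January 2010 versus January 2016 comparison the boss wants is a dictionary lookup instead of a day of spreadsheet archaeology.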

 

In 2014, we started researching LIMS vendors and set-up several web demos. We wanted a system that would work for both labs as well as fitting our individual needs. For example, our wastewater laboratory needed a system that would be able to interface with our operations and industrial pretreatment databases, and the environmental lab needed an application that was capable of entering data in the field. We all met as a group and mapped out the current LIMS flow and how we wanted the future LIMS to flow through each step from sample log-in to reporting. This identified areas that we wanted to streamline throughout the process.

 

During this time, we brought our IT department into the discussion since they would play a major part in installing and maintaining the system. They assisted with the request for proposal and were involved every step of the way. Below is a detailed timeline for this project:

  • April 2015- Request for proposal was issued
  • May 12, 2015- Pre-proposal conference with vendors to discuss specifications
  • May 19, 2015- Questions from vendors were due
  • June 9, 2015- Request for proposals were due
  • July 7, 2015- Demonstrations from vendors (on-line)
  • July-October 2015- Worked with IT and City Attorney to draft contract and completed the Council memo.  
  • September 2015- Created a LIMS administration team which consisted of employees from the wastewater lab, environmental lab, and IT; the team met bi-weekly throughout the process
  • October 27, 2015- Council approved the purchase of the new LIMS
  • November 10, 2015- Kickoff meeting with new LIMS vendor, LIMS administrator, IT, purchasing and implementation team
  • November 15, 2015- Conference call with the vendor’s project manager and the City’s LIMS administrator to discuss the project schedule and items that we needed to submit to the vendor such as: new client checklists, static data tables, and instrument interfaces
  • December 2015- Static data tables were uploaded to the vendor’s ftp site
  • January 2016- New LIMS was installed on the administrator’s computer; initiated bi-weekly conference calls with LIMS administrator and vendor
  • February 2-4, 2016- First on-site training and system set-up
  • March 8-10, 2016- Second on-site visit and training with the vendor; training on sample entry, quality control, label printing, calculations, and initial set-up of instrument interfaces
  • April 5-8, 2016- Third on-site training which was customized for each lab; additional work on software and instrument interfaces, label printing, calculations, and mobile devices
  • April-June 2016- Staff parallel tested the new system with the old system; started using the barcode scanners, auto-logging samples, and working with the instrument and software interfaces
  • June 29, 2016- Vendor removed practice data in the new LIMS
  • July 1, 2016- Stopped logging in samples in the old system and started entering samples in the new system
  • August 2016- IT uploaded old databases to the vendor’s ftp site and 10 years of data will be archived
  • September 2016- Four analysts will travel to the vendor’s headquarters for four days of on-site training

There is still work to be done, but we are enjoying all the new features that our LIMS has to offer. When you think of taking the plunge, just remember that it can be a long journey but a journey well worth taking.

Lesa Julian is the Wastewater Laboratory Supervisor for the City and County of Broomfield.

October 25, 2011

    Congrats to Parker Water and Sanitation

    Analyst of the Year - Kathy Wagner

