Article Loss Prevention

What is Material Risk? Asbestos in Soil

- by

The risk of harm arising from asbestos left in soil is one which has only recently been identified. We do not yet have any legal cases arising from this type of exposure, but asbestos itself has a long and troubling history in the workplace: there are reputed to be some 4,000 asbestos-related deaths per year.

Consequently there are many civil cases, some recently appealed to the Supreme Court, which give guidance on how the courts view the assessment of that risk, and how they may evaluate the causal risks of asbestos in soil.

There will be two general types of claims (possibly group claims), those for property damage and those for damage to health. These will be assessed under the civil law of negligence, but there will be differences for each type of claim in proving the liabilities and the causal links between the damage and the negligence. For property claims, the process is the standard approach for negligence, but for the human health cases, the courts recognise the special circumstances of asbestos exposure and impose a lower burden of proof on the claimant.

Remember that in England and Wales the claimants have not only to have been exposed, but to have progressed to full-blown mesothelioma, not just the pleural plaques which precede it (unlike in Scotland, where pleural plaques alone are actionable). The claimants also have to prove that the exposure was negligent. This means that not all exposure victims will have legal claims.

Then the claimants have to prove that the negligent exposure caused their condition. This is fundamental to all claims in negligence; you have to prove that the negligence specifically caused the damage which you suffered. However, this is so difficult in cases of asbestos exposure that the courts have made an exception for such cases.

The House of Lords decided that the claimants did not have to prove which negligent source or which defendant caused the problem, only that the negligent exposure by the defendant had materially increased the risk of the disease for the claimant.

The Compensation Act 2006 amended the rule to provide that each negligent defendant would be jointly and severally liable, no matter how much of the exposure each was individually responsible for. But the duty did not become a statutory duty; the claim is still evaluated under the common law of tort.

In 2007, the Court of Appeal considered whether material exposure could be quantified as a doubling of the risk, but decided that for mesothelioma it could not, because that would contravene the Compensation Act, which talks only of "material" risk. So a material increase can be less than a doubling of the risk. But how much is "material"?

Cases so far have all been employers' liability claims. This year, the Supreme Court heard two claims together concerning different types of exposure: one long-term, low-level exposure at work which increased the risk by only 18%, and the other a single period of exposure as a school pupil while school buildings were being constructed. The court held that the material increase of risk test applies even to these limited exposures.

How low can the risk be before it is not material? We do not yet know. But there will come a time when an exposure will be seen as too insignificant to be taken into account. Lord Phillips said in the Willmore case: "I doubt whether it is ever possible to define, in quantitative terms, what for the purposes of the application of any principle of law is de minimis. This must be a question for the judge on the facts of the particular case. In the case of mesothelioma, a stage must be reached at which, even allowing for the possibility that exposure to asbestos can have a cumulative effect, a particular exposure is too insignificant to be taken into account, having regard to the overall exposure that has taken place."

Scientific advances as to the cause of the disease might give us a clearer view, and the law will respond to this. Lord Phillips also said: "(the 2006 Act) does not preclude the common law from identifying exceptions to the "material increase of risk" test, nor from holding, as more is learned about mesothelioma, that the material increase of risk test no longer applies."

Watch this space.

Article Contaminated Land Laboratories

The problem of made ground

- by

The categorisation, analysis and reporting of ‘made ground’ is a recurring nightmare for the modern laboratory. Traditionally a by-product of land reclamation schemes, a container of the stuff can contain traces of anything from steel, concrete and brick to nappies and Coke cans – and that’s on a good day.

Ask anyone from the engineer taking samples at the coalface to the men in white coats analysing them, and you will find that there is no all-encompassing approach to deal with the ‘made ground’ conundrum. Nevertheless, with brownfield sites being universally hailed as the sustainable way forward, now, more than ever before, is the time to seriously evaluate the methods employed both on-site and in the laboratory and try to circumvent the insidious ‘no easy answer’ maxim.

Much of the confusion goes back to the introduction of the Environment Agency’s Monitoring Certification Scheme (MCERTS) for the chemical testing of soils. Any laboratory operating under this banner has to submit results that fulfil both the general requirements of ISO/IEC 17025 and the specific method validation and performance requirements of MCERTS. The latter is problematic for laboratories dealing with made ground, inasmuch as it requires samples to conform to specific sample matrices in order for the results to be accredited. For relatively unadulterated soils, this has meant the creation of soil classification categories such as ‘loamy soil’, ‘sandy soil’ or ‘clay type soil’. It is worth noting that while some geotechnical engineers may see this as a tenuous oversimplification, it is widely regarded as the best available approach and has the full endorsement of the Environment Agency and UKAS – albeit based on economic drivers. Made ground’s inherent ambiguity throws a rather obtrusive spanner in the works when faced with these basic matrices and prompts all manner of interpretive stances and questions. Some good starters for ten: can you report made ground results as accredited? Is it possible to report them as ‘unaccredited’ to make it clear to the engineer that the sample does not fall into a clearly defined matrix?

It isn’t just an issue of categorisation – the whole process, from preparation to final report, lacks consistency as laboratories adopt their own approaches, asking questions such as: do we dry the sample? Do we mill the sample to a uniform particle size? Do we discard anything over 2mm? Do we ignore everything that is not soil? None of these approaches will produce an inaccurate result per se, but each has the potential to give a misleading picture of the site.

If, in addition to that head-scratching list of questions, you consider the fact that the commercially driven nature of redevelopment schemes has turned laboratories into high-tech, scientific conveyor belts, the complexities of the problem become increasingly pronounced. It is a crossroads situation reliant on good judgement, experience and, above all, a decent sample. It is impossible to overstate the critical nature of the latter point: without a comprehensive sample, the laboratory cannot do its job. In other words, it cannot capture the essence of a site’s industrial legacy and act as a signpost to the appropriate action.

Though MCERTS has to a certain extent raised standards in the laboratory, it missed an opportunity by not offering any guidance to the geotechnical engineer on the best available techniques (BAT) for sampling, storage and transportation; nor does it elaborate on the consequences of incorrect, inappropriate or inadequate sampling. The reason the EA has put the onus on the laboratories is understandable – to allow continuity of testing pre- and post-MCERTS – but the resultant confusion and knowledge deficit, particularly with regard to sampling, is less than satisfactory.

As throwing legislation at the problem is unlikely to be constructive, the best achievable course of action is to engender a milieu of interdisciplinary compatibility fuelled by open lines of communication, intellectual communality and the symbiotic sharing of knowledge. Geoscientists should learn how to describe their samples adequately, how to make a sample manageable for the laboratory, and to understand the laboratory processes of sample preparation, analysis and reporting. By the same token, chemists should acquire some field experience, learn about the conditions engineers face on-site and educate themselves on the processes that inform geotechnical sampling techniques.

If the question of how to produce consistently accurate results from made ground is reducible to a single answer, it can only be to ask more questions: what are the limitations of the selected analytical method? If there are limitations, do they matter in this case? On what basis is the data reported? Does it match the basis on which my acceptance criteria are calculated? Has the sample data been generated in ideal conditions using ideal standards which are unlikely to represent the conditions on my site? Add a soupçon of communication, wait for MCERTS to catch up and we’re well on our way.

Andrew Buck PhD, MSc, CSci, CChem, FRSC is the Technical Director of Envirolab (www.envlab.co.uk)

Article Contaminated Land Data Management Laboratories

Contaminated Land Analysis – Introducing Doubt Into An Uncertain World

- by

Let’s be honest, to most of us in the ground engineering community, chemistry is something of a black art. It’s a subject we never properly understood at school and certainly not one we intended to revisit in our professional capacities. We can muddle through the uncertainties of soil mechanics and a few of us claim a vague understanding of finite element analysis. Imagine our horror therefore when chemistry abruptly re-entered our world in the form of contaminated land. Our inattention and tomfoolery at the back of class has suddenly come back to bite us. Hazy schoolday recollections of sodium fizzing around in the sink or the exploding magic green fountain aren’t going to get us out of this one.

And it gets worse! Chemistry is no longer even just a disagreeable side issue for many of us and on many developments it sits, gloating, athwart our critical path, knowing full well that not only do we not know the answer, we are often unsure of the right questions to ask as well.

So where is our white knight, to whom can we turn for help and enlightenment? In the past we might have turned to our laboratory for help. However over the last 15 years or so, there has been a complete rationalisation of the chemical testing market. Laboratories have tended to become bigger and more automated, offering cost effective analysis but consequently less consultancy support. Intense competition amongst the key players means that margins are so tight there is little room in modern production line chemistry for added value services. Testing has become a numbers game.

There are now several degrees of separation between the engineer and the chemist. Yet we are in fact very similar in one key respect and it is here that we close the circle. They don’t understand what we do and we don’t understand what they do.

Laboratories must adopt operating practices which enable them to make a profit under conditions of intense price competition. Their choices fundamentally affect the quality and reliability of the data they produce. We don’t even know what questions to ask and many laboratories in turn are less than forthcoming in disclosing the limitations of their data. We work together in blissful ignorance even though our interaction (or lack of it) has a critical influence on the quality of the data they produce and we then use.

Accreditation schemes such as UKAS, compliance schemes (e.g. Contest, WASP) and the recently developed MCERTS scheme championed by the Environment Agency are all designed to address quality issues in laboratory testing. However all have significant limitations which are not readily apparent and are certainly not advertised to a largely ignorant consumer.

So what’s the problem? Well, the example in Box 1 below illustrates this nicely. In it we have simulated total soil cadmium data from two simulated laboratories. One is a reputable, highly regarded outfit with excellent quality control, and in the case of our simulated sample it has in fact got an answer which approximates to the true value. The only downside is that quality costs, and it charges £1 to undertake the analysis. The other laboratory has a less robust quality system and the cost savings allow it to charge just 50p for a cadmium determination. However, its reported total cadmium concentration is in this case woefully inaccurate. See if you can spot the wrong answer.

Box 1

                    Laboratory A         Laboratory B
Total Cadmium       617.2 mg/kg Cd       617.2 mg/kg Cd

The problem, of course, is that you cannot tell by looking whether data is reliable or not. Because buyers of chemistry are largely ignorant of chemistry, and the product we buy does not readily reveal its quality, the key differentiator becomes price. Whilst there is an industry bottom line which we might say is policed by accreditation schemes such as UKAS, we should not be naïve enough to believe that this is in any way a guarantee of a right answer. Now the punch line, and you fanatically precise engineers are not going to like this: we should recognise that even good data is not ‘correct’ in the right-and-wrong sense, and that some degree of uncertainty is inherent in every result. Sometimes this uncertainty is very large indeed; 617.2 mg/kg could actually mean anything from about 80 to 1000 mg/kg, and that should give us all some food for thought.

So what are the questions we should ask? Well, here are some important ones for starters.

Basis, basis, basis

The basis on which you send your sample to the laboratory could be as follows. It is a cold and wet day. The wind is making life difficult and you are worried about getting caught in the traffic if you don’t get off site soon. You shovel a couple of kilograms of rubble into the bag and leave it by the gate for the laboratory to pick up sometime later in the week. In a couple of weeks the laboratory (UKAS accredited, as the contract specified) reports back to you and you are relieved to see the thiocyanate content is 24.7 mg/kg, just below your limit of 25 mg/kg. You’re in the clear, you can sign the site off – or can you? Have you considered these questions?

How was the sample prepared, and by whom?
What is the precision and bias of the method used?
On what basis are method precision and bias measured?
On what basis is the data reported?
On what basis is your acceptance criterion calculated?

Sample Preparation

We can’t emphasise enough how important initial sample preparation is. If it is not right then everything that comes after is wrong. Unfortunately good sample preparation is expensive, labour intensive and very repetitive – it is simply not fashionable and therefore often neglected. You will almost always find the least qualified staff in a laboratory carrying out the most important function – sample preparation.

Accreditation schemes accredit results and sample preparation does not produce a result. It is debatable therefore whether sample preparation falls within the scope of accreditation. Imagine that – you use a UKAS accredited laboratory and their single most important operation is not actually capable of being accredited and is carried out by the least qualified personnel in the company.

The Quality Control Con

Laboratory quality control focuses on the instrumental side of the analysis. QC data is usually generated from a point after samples have been prepared for analysis. Prepared QC samples or certified reference materials are finely ground, dry, inherently homogenous materials. Real samples are sun drenched, windswept, dirty, heterogeneous lumps. Don’t believe, therefore, that quoted QC data will necessarily bear any resemblance to that achievable in your samples.

For example, the QC data for determinands which are normally analysed wet (like cyanide) may in fact be generated on dry, ground reference soils which are spiked immediately before analysis. Doing it this way ensures excellent QC data but does not relate all that well to the true bias and recovery one might get from a mixed, wet, contaminated soil.

Precision, bias, repeatability & uncertainty

Each measurement a laboratory makes is subject to any number of errors. Good laboratories minimise the impact of such errors through sound methodology and quality control procedures. Uncertainty cannot, however, be eliminated altogether, and a knowledge of it could be critical to your remediation scheme.

For example, if you have a clean-up criterion of 2500 mg/kg of mineral oil on a scheme and your sample shows a concentration of 2000 mg/kg, you might be forgiven for breathing a sigh of relief. If you knew that the true precision of the analytical method is more like +/- 100%, you might have cause to re-appraise your hasty signing off of the site.
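By way of illustration only, the short Python sketch below uses the figures from the mineral oil example above; the compliance_check helper and the simple interval treatment are illustrative assumptions of ours, not a prescribed statistical method. It shows how even a crude allowance for the quoted precision changes the sign-off decision:

# Minimal sketch: comparing a reported concentration with a clean-up
# criterion while allowing for the quoted method precision. The numbers
# come from the mineral oil example above; the simple interval approach
# is illustrative only, not a prescribed statistical treatment.

def compliance_check(result_mg_kg: float, criterion_mg_kg: float,
                     relative_precision: float) -> str:
    """Classify a result against a criterion given a +/- relative precision."""
    lower = result_mg_kg * (1 - relative_precision)
    upper = result_mg_kg * (1 + relative_precision)
    if upper < criterion_mg_kg:
        return "below criterion even at the top of the uncertainty band"
    if lower > criterion_mg_kg:
        return "above criterion even at the bottom of the uncertainty band"
    return "uncertainty band straddles the criterion - cannot safely sign off"

# 2000 mg/kg mineral oil against a 2500 mg/kg criterion with +/- 100% precision
print(compliance_check(2000.0, 2500.0, 1.0))
# -> "uncertainty band straddles the criterion - cannot safely sign off"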

Bias, or recovery, is in simple terms a measure of the amount you get out compared with what you originally put in. For example, a laboratory quotes a UKAS accredited method recovery for DRO of 95%. Fine, you think: a 95% recovery is very good, and the method is UKAS accredited. What you don’t appreciate is that the recovery is quoted on a reference sample that has been dried and finely ground. In other words, the quoted recovery does not account for any volatiles lost during drying and grinding.

We can however welcome the (relatively) new MCERTS scheme championed by the Environment Agency in so far as precision and bias data should now accompany all results of analysis.

How is data actually reported?

Understanding the basis for reporting is really, really critical. Some samples are analysed wet, some dry, some with stones removed, some without. Some data is reported dry and some wet, some whole and some just on the fines. Data on the same sample may be reported on a different basis. Do you know on what basis your samples are analysed and reported? Do you know on what basis the acceptance criteria you use (CLEA, Dutch) are generated?

The example below illustrates this point, demonstrating the range of total mercury values you can get depending upon how you choose to express the data or how the laboratory chooses to prepare your sample.

A 100g sample of a contaminated clay is submitted for total mercury analysis. It contains 500 µg of mercury and is composed of the fractions set out below. For this example we assume (fairly reasonably) that all of the mercury is present in the fines. Our acceptance criteria are the CLEA Soil Guideline Values of 8 mg/kg Hg for residential use with plants and 15 mg/kg Hg for residential use without plants.

Fraction                                  Mass
Soil fines (less than 2mm diameter)       30 grams
Stones (2-10mm diameter)                  20 grams
Stones (greater than 10mm diameter)       25 grams
Water                                     25 grams

Data reported on:     Result (mg/kg Hg)     CLEA 8 SGV     CLEA 15 SGV
Whole sample           5.0                  pass           pass
Fines, dry            16.7                  fail           fail
Whole, dry             6.7                  pass           pass
<10mm, dry            10.0                  fail           pass
<10mm, wet             6.7                  pass           pass

The example illustrates a huge variation in ‘right’ answers which completely spans the selected acceptance criteria. It also reveals that the same (or very similar) answers can be obtained using completely different assumptions – a whole dry basis being very similar to a <10mm wet basis in this example. More worrying still, the fines dry result (arguably the most common way of determining mercury in soil) is over three times higher than the result expressed on the whole sample (arguably the true result).
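For anyone who wants to check the arithmetic, the short Python sketch below simply reproduces the worked example. It assumes, as the example does, that all 500 µg of mercury sits in the <2mm fines, and divides that mass by whichever portion of the sample the result is expressed on; the figures and Soil Guideline Values are those quoted above.

# Minimal sketch reproducing the mercury worked example above. It assumes,
# as the example does, that all 500 ug of mercury sits in the <2mm fines,
# and simply divides that mass by whichever portion of the sample the
# result is expressed on. Figures and SGVs are those quoted in the text.

HG_UG = 500.0          # total mercury in the 100 g sample (ug)
FINES_G, STONES_2_10_G, STONES_GT10_G, WATER_G = 30.0, 20.0, 25.0, 25.0

bases_g = {
    "Whole sample": FINES_G + STONES_2_10_G + STONES_GT10_G + WATER_G,
    "Fines, dry": FINES_G,
    "Whole, dry": FINES_G + STONES_2_10_G + STONES_GT10_G,
    "<10mm, dry": FINES_G + STONES_2_10_G,
    "<10mm, wet": FINES_G + STONES_2_10_G + WATER_G,
}

for basis, mass_g in bases_g.items():
    conc = HG_UG / mass_g                     # ug/g is numerically mg/kg
    verdict_8 = "pass" if conc <= 8.0 else "fail"
    verdict_15 = "pass" if conc <= 15.0 else "fail"
    print(f"{basis:<13} {conc:5.1f} mg/kg  CLEA 8: {verdict_8}  CLEA 15: {verdict_15}")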

Sample Homogeneity

We all know that reliable data depends upon the sample from which it was extracted. We all also know how difficult it can be to take representative samples from very mixed fill and contaminated ground. The apparent precision of laboratory data can be very misleading. In reality a result of 645.37 mg/kg lead (as Pb) does not mean that the horizon we sampled contains a concentration of 645.37 mg/kg lead. The problem is that we don’t know what it means, because we haven’t estimated the variability of the sampled horizon and we haven’t a clue about the limitations of the techniques the laboratory uses to prepare, extract, analyse and then correct the raw data to produce the reported result. Engineers who are used to dealing with relative certainties would be horrified to learn that the true precision of chemical data is very, very poor. In many cases the best you could expect might be agreement only to within an order of magnitude.
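As a purely illustrative sketch (the replicate lead values below are invented for the purpose), one simple way to put a number on that variability is to analyse several replicate samples from the same horizon and look at the spread rather than a single figure:

# Minimal sketch of one way to get a feel for the variability of a sampled
# horizon: take several replicate samples and look at the spread of results
# rather than a single figure. The replicate values below are hypothetical.
from statistics import mean, stdev

replicate_pb_mg_kg = [645.37, 412.0, 890.5, 530.2, 1210.0]  # hypothetical replicates

avg = mean(replicate_pb_mg_kg)
sd = stdev(replicate_pb_mg_kg)              # sample standard deviation
rsd_percent = 100.0 * sd / avg              # relative standard deviation

print(f"mean = {avg:.0f} mg/kg Pb, sd = {sd:.0f} mg/kg, RSD = {rsd_percent:.0f}%")
# A large RSD is a reminder that quoting 645.37 mg/kg to two decimal places
# says far more about the instrument than about the ground.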

Conclusions – The Key Questions

There is no doubt that the reliability of the analytical data our industry routinely uses is seriously limited. There is also no doubt that many (perhaps most) practitioners do not realise this. This is not because the laboratories are producing poor quality work. Rather it is a combination of the uncertainty inherent in sampling and analysis, coupled with the limitations a price-driven market places upon laboratories. Factor in a lack of understanding on both sides of the effect (or even existence) of such limitations, and it is easy to see how we can find ourselves skating on thin ice without even realising it.

What are the answers then? Well the answers are really a series of questions we should routinely ask ourselves when assessing our methods, our laboratories and their data.

1. Laboratories broadly use the same analytical equipment. What gives to allow some laboratories to be a lot cheaper than others?
2. What are the limitations of the selected analytical method? There are always limitations. Do they matter in this case?
3. Absolutely critical. What is the basis on which my data is reported? Does it match the basis on which my acceptance criteria are calculated?
4. Is the laboratory QC data realistic, or has it been generated in ideal conditions using ideal samples which are unlikely to represent the conditions on my site?

If you can make a reasonable attempt at answering these questions you will be a long way towards understanding the basis on which your data has been generated, and in turn you will be able to interpret your data with reasonable confidence and take due account of the uncertainty within it.

If you can’t immediately answer these questions then you don’t know the basis on which you are interpreting your data/running your computer model/applying your acceptance criteria/remediating your site/providing your client with a collateral warranty. And it’s as fundamental as that.

Richard Puttock Partner
Michael Dinsdale Associate
Peter Brett Associates

Article Contaminated Land Laboratories

MCERTS

- by

The Environment Agency’s Monitoring Certification Scheme for the Chemical Testing of Soils:
What it is. How it affects you. What you need to do.

MCERTS Monitoring Certification Schemes were first introduced in industrial sectors with regulated processes that result in stack emissions. The scheme requires those companies to deliver monitoring results that are “valid, reliable and accurate”. Achieving this depends on using the appropriate resources – correct test methods, competent personnel, accredited organisations, and suitable equipment and planning.

The MCERTS scheme for chemical testing of soils was introduced by the Environment Agency to support its regulatory activities and to allow informed, quality assessments of the management of contaminated land under a number of regimes, including Part IIA of the Environmental Protection Act 1990, the Pollution Prevention and Control Regulations 2000 and the Waste Management Licensing Regulations 1994.

The scheme is applicable to all testing laboratories and procurers of analytical services where results generated for the chemical testing of soil are submitted to the Agency. In order to gain accreditation under the scheme, laboratories are required to have their processes, essentially test methods, assessed within a quality management framework by the United Kingdom Accreditation Service (UKAS), against both the international standard ISO 17025 and the MCERTS requirements.

There are increasing pressures on businesses to comply with Environment Agency regulations and with European and international standards. Using a laboratory with MCERTS accreditation alleviates some of this pressure because it guarantees the proper use of suitable methods, standards, services and equipment, trained and qualified personnel, and quality assurance and quality control, all leading to reliable data. MCERTS accreditation also assures users that the laboratory meets performance standards set out in current international standards and the growing requirements of EC directives.

Failure to meet the regulations can be costly, both financially and to a company’s reputation. An MCERTS accredited laboratory assures the user that they have met standards in a number of areas including:

  • The selection and validation of test methods

  • Sampling pre-treatment and preparation

  • The estimation of measurement uncertainty

  • Participation in proficiency testing schemes

  • The reporting of results and information

The benefits of the scheme include:

  • Providing assurance to stakeholders of the quality of data from testing

  • A level playing field, based on the Agency’s requirements, is established

  • Identifying that the chemical testing of soil is a critical component in producing defensible data for regulatory purposes.

In order to guarantee reliable data from the chemical testing of soils and therefore reassurance that risks are minimised, procurers of testing should:

  • Ensure the chemical analysis results submitted to the Agency for regulatory purposes conform to MCERTS requirements.

  • Check that the laboratory conducting the testing has MCERTS accreditation for all the parameters requiring analysis. Accreditation is given on a parameter-by-parameter basis. If the laboratory does not have the correct accreditation, sub-contracting of the required test to another MCERTS laboratory may be necessary. If a suitable laboratory does not appear to be available, contact the Environment Agency for advice.

  • Check that the test methods employed by the laboratory are appropriate and fit for purpose in terms of the parameter, the critical level of interest (CLI) and the matrix. The CLI may be a soil guideline value or a regulatory limit.

  • Check with the laboratory that the sampling processes, preservation and transportation are appropriate.

  • In collaboration with your chosen laboratory, have complete audit trails available that address aspects such as sample location, depth of sample, date and time of sample, reference identity and the laboratory used.

The MCERTS scheme for the chemical testing of soils was phased in, but has been fully operational since 1 March 2005. Therefore, all data for regulatory purposes should now be to the MCERTS standard. Laboratories and the procurers of testing need to work together to ensure that the test data provided meets the requirements and satisfies the needs of the ultimate client.

Cliff Billings Group Technical & Quality Manager STL UK

=======================================

EA’s position on MCERTs

From 1st March 2005, the Environment Agency has required accreditation to our Monitoring Certification Scheme (MCERTS) where laboratory soil testing results are submitted to us as part of a regulatory regime for which we have statutory responsibility.

We strongly recommend that MCERTS accredited methods are used for soil testing in activities to do with site remediation, whether carried out on a voluntary basis or to comply with planning requirements. This is particularly important in relation to any waste management issues on the site.

Jackie Harrison Environment Agency

Contaminated Land Working Group Meetings

In recent meetings of the Contaminated Land Working Group, it has been clarified that the EA is a consultee but not a statutory regulator for planning applications. This means that MCERTS data may not always be required at the planning stage. Although the EA recommends MCERTS, the final decision rests with the Local Authority.

Some AGS Members feel that all tests should be to MCERTS so that the reports can be used at a later date. At present, the EA is expected to take a pragmatic approach to historical data obtained before the introduction of MCERTS, taking account of whether the laboratory is now accredited and other relevant factors. However, this may not always be the case, particularly for data collected after March 2005, and the need to ‘future proof’ data should be seriously considered.

=======================================

Meet NHBC Requirements with MCERTS

The NHBC welcomes MCERTS accredited testing and supports its use in association with robust and representative soil sampling strategies when investigating sites affected by contamination. It brings transparency and consistency to analytical testing techniques and encourages discussion between consultants and testing laboratories, which can only be a positive step forward.

Article Contaminated Land Laboratories

The Extension of MCERTS to Chemical Testing Of Soils – An Update

- by

In January 2003, issue number 45 of the AGS Newsletter contained an article by Bruno Guillaume, of Arup Geotechnics, who outlined the MCERTS performance standard for the chemical testing of soils. The following is an update, and a view from an analytical chemist’s perspective.

The Environment Agency website notes that the Agency is aware it will take time for laboratories to gain approval through the appropriate accreditation process. An eighteen-month period, starting from March 2003, has been given for laboratories to bring their soil testing methods up to the MCERTS standard.

During this period, laboratories reporting data to the Agency must, as a minimum, be accredited to the ISO 17025 standard for the soil test methods. It is also recommended that tests have a brief method description together with estimates of bias and precision. From September 2004 only data from laboratories that have been accredited to ISO 17025 for MCERTS will be accepted.

Since the last article in the newsletter, Version 2 of the MCERTS standard has been published; it has been available since February 2003. The standard highlights particularly important areas, namely contract review, bias and precision targets, quality control (both internal and external), method validation, and uncertainty of measurement. Important differences from the first version are the exclusion of expected limits of detection for methods, and the inclusion of an improved protocol for validation.

The issues can be confusing, but the standard simply aims to establish a level playing field in a competitive market, based on the Agency’s requirements, and to set a minimum acceptable performance. In short, the data received by the laboratory’s customers must be accurate, reliable and comparable.

The analysis of soil is complex in terms of the chemistry involved. It aims to determine both macro and trace components in a matrix that is, quite often, dirty in both a physical and chemical context. There is a need to analyse for trace organic and metallic contaminants in soils that contain large quantities of other industrial materials, such as oil or tar, in a background that also contains high concentrations of naturally occurring, or artificially polluted, inorganic compounds.

We all use “parts per million” as routine terminology, but the significance is commonly ignored. 1 part per million is more easily visualised as 1 grain of salt in a swimming pool. When we talk of the concentrations of polynuclear aromatic hydrocarbons (PAH), an important environmental parameter, we often refer to micrograms per kilogram, which is three orders of magnitude lower.

The contaminated land testing industry has grown very quickly, and methodologies have been borrowed from other, more well-established areas of analytical chemistry, such as food or potable water. The only industry standard for the analysis of soils in the UK is the robust and technically sound set of “British Gas Methods”, but even these were not designed to tackle the lower end of detection, and do not take advantage of some of the more modern developments in analytical chemistry.

MCERTS effectively defines a standard for the performance of analytical methods, and includes the requirements of ISO 17025 in terms of certification of instrument performance, approved competency of personnel and the accreditation of laboratory procedures and organisation. It means that it is no longer sufficient that the laboratories follow a rigorous UKAS quality system in line with the international standard, but that the methodologies must also be demonstrated as fit for purpose.

The Environment Agency has not, in its standard, adopted the principle of prescriptive methods, as has been the approach in the USA through the so-called EPA procedures. That approach can commit the industry to inappropriate analytical techniques, which can take a long time to reform once committed to paper, and takes away the flexibility to develop new improvements for the industry as a whole.

It cannot be assumed that environmental specialists requiring the services of an analytical laboratory will have the depth of technical knowledge to understand the concepts of analytical bias or precision. MCERTS is designed to take away the need for such expertise.

Another variable that stops a customer from being able to compare “apples with apples” is the limit of detection (LOD) quoted. This can vary widely depending on how the laboratory defines it. A sound statistical approach is to use three times the standard deviation associated with a blank, or a sample with a very low concentration of the determinand of interest, with the determinations interspersed with other standards and samples over eleven separate days. Lesser definitions can appear to give a “better” LOD, but they mislead the customer into thinking they are getting an improved service, and can give false positive concentrations on soils where none of the contaminant actually exists.
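As a purely illustrative sketch of the “three times the standard deviation” definition described above (the blank results below are hypothetical), the calculation amounts to no more than this:

# Minimal sketch of the 3 x standard deviation definition of the limit of
# detection described above. The blank results are hypothetical; in practice
# the determinations would be interspersed with other standards and samples
# over eleven separate days, as described in the text.
from statistics import stdev

blank_results_mg_kg = [0.012, 0.018, 0.009, 0.015, 0.011, 0.014,
                       0.010, 0.016, 0.013, 0.017, 0.012]  # hypothetical, 11 days

lod = 3.0 * stdev(blank_results_mg_kg)
print(f"Limit of detection = {lod:.3f} mg/kg")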

All of these concerns are addressed by the MCERTS standard. Precision and bias must be of an acceptable standard, as must the LOD. “Recoveries” – what happens when a soil is spiked with known amounts of the material of interest and re-analysed – are examined in the standard to ensure acceptable performance. The validation must be carried out on three completely different soil types at two spiking levels, and must include the use of certified reference materials wherever possible. Detailed methodologies, together with a prescribed uncertainty of measurement, must also be given.

The contract review is the point at which the client’s needs must be understood, and at which the laboratory must document them. What does “Total PAH” mean, or “Total TPH”, and what does the client consider to be the critical level of interest? This is an area quite often poorly addressed, and to which the standard gives some priority.

The laboratory’s quality control also comes under close scrutiny. At least 5% of the resources allocated to a test must be used to ensure validation. In addition, the laboratory must participate in as many of the acknowledged external proficiency tests as is appropriate, such as Contest, Aquacheck and the SPH test scheme. The results of these must be readily available for inspection by the client.

It is generally recognised within the community of analytical laboratories that complying with the new version presents a real challenge. The requirements relating to bias and precision and, in particular, the guideline that “the limit of detection usually regarded as being fit for purpose is 10% of the concentration regarded as the critical level of interest” are extremely demanding. Some method improvements will be required within the industry before these levels of performance can be achieved. Most laboratories, however, will be relieved that any ambiguity has now been removed, so that everyone can compete to provide a well-defined product and market their expertise without confusion.

Whilst addressing the vagaries of analytical results, the Environment Agency has also acknowledged the uncertainty associated with other areas, and is considering certification schemes to address field aspects, including sampling. Other subjects, for example the suitability of leachability tests, toxicity assessments and the bioavailability of metals, also need to become topics for guidance by the regulator.

Article Contaminated Land

Our man in Europe..

- by

Note by: Peter Rodd, JacobsGIBB representative on the AGS Committee and Contaminated Land Working Group who attended a meeting of the ISO/TC 190 Committee in Snekkersten, Denmark as the UK representative

The BSI asked the AGS to propose a representative to serve on its TC 190 EH/4 committee, which is involved in the harmonisation of standards in the EC and internationally for soil quality. The standards being worked on are largely aimed at soil in the soil science sense, but soil is defined as all material above bedrock, so geotechnical and contamination issues are also addressed. The BSI required an expert in the physical properties of soil, and I was proposed and accepted.

I attended the next meeting of the EH/4 committee in September 2001 and was asked to go to the annual meeting of ISO/TC190, in early October, to represent the BSI and attend the sessions of SC5 (considering physical properties of soil) and its working groups.

JacobsGIBB allowed me the time to attend the Meeting in Snekkersten, a small town about 30 – 40 miles north of Copenhagen. I chose my route to the meeting via Malmo and then by rail over the new double decker bridge (road over rail) between Sweden and Denmark. To my surprise the train went through to Snekkersten (one stop before the end of the line at Helsingor, home to Hamlet’s Castle) without the need to change.

The first session attended was for SC5/WG3, looking at standards for water content. The working group was dealing with two standards. One of these, BS ISO 11461:2001, Soil Quality – Determination of soil water content as a volume fraction using coring sleeves – Gravimetric method, had recently been issued as a full standard and was not discussed.

The second document, ISO/DIS 16586, Soil Quality – Determination of soil water content as a volume fraction on the basis of known dry bulk density – Gravimetric method, is at a late stage of development (DIS – Draft International Standard). Comments from member countries were discussed and adopted where appropriate. One issue was a conflict between the two documents: BS ISO 11461 contains a note suggesting that drying at 105°C for samples containing significant organic matter will not greatly affect the result, whereas ISO/DIS 16586 suggests it will. This will be resolved when the recently issued standard comes up for review.

The next session attended was for SC7/WG6. The working group is considering three standards: ISO/AWI 21268-1, Soil Quality – Leaching procedures for subsequent chemical and ecotoxicological testing of soil and soil materials – Part 1: Batch test using a liquid to solid ratio of 2 l/kg dry matter; Part 2: Batch test using a liquid to solid ratio of 10 l/kg dry matter; and Part 3: Up-flow column test.

A large part of the session was taken up with a presentation of results obtained using different extraction procedures. The use of an end-over-end shaker or a roller table did not seem to affect the results, so either will be permitted. There did, however, appear to be a systematic error between the extraction procedures used by the laboratories, so further work is required before the standards can move forward. The chairman also asked for feedback from the member countries on the type of containers they recommend (feedback to PGR please).

(Note: feedback on the items discussed should be sent to Peter Rodd, pgr@mpg-ctrl.com)