2012 Standards Discussion

Discussion on Standardisation

Comments Made Prior to canSAS2012

  • The following is the agenda of work posted under business for canSAS-2012. Please add comments and expand on details here:
    • Purpose and goals: Intercomparison of data measured on the same sample with different instruments and different techniques (SAXS, SANS, light scattering etc.) can prove valuable in a number of ways. In particular it aids understanding of details of the experimental methods and it can help assess reliability. In a similar way, looking at results of data reduction or analysis generated with different software can provide valuable information about performance and verification of methodology. Specifically these activities should:
      • Provide Quality Assurance/Quality Control,
      • Improve (reduce) uncertainties of SAS measurements in general,
      • Help each facility continuously improve performance and quality of data.
    • We will discuss what types of tests are interesting/important:
      • Beam intensity standards - there are several different ways to quantify this
      • Standards to test resolution
      • Absolute intensity calibrations,
      • Materials for Q calibration (see the Q-calibration sketch after this list),
      • etc.
    • Standards are not just measurements:
      • Software comparison - do we derive the same results from different computer programs?
      • Analysis methods may be similar or different (e.g. modelling versus transforms versus calculation of invariants)
      • Different procedures use different approximations - are these documented?
      • Might approximations, rather than the most elaborate calculations, be useful? Under what circumstances?
      • How do analysis programs interpret data? What do they assume if data (such as uncertainty or resolution) is missing?
    • Some other related issues:
      • Inelastic,
      • Multiple scattering,
      • Wavelength contamination,
      • Grazing incidence scattering - standards,
      • Detector efficiencies at different wavelengths,
      • Limits in signal-to-noise ratio - how weak a signal can be reliably extracted,
      • etc.
    • Outcomes needed are:
      • A written plan to sustain long term effort in this area
      • This should describe how to seed, co-ordinate and publicise “ad-hoc” projects,
      • Assess how frequently exercises can be undertaken.
      • Define good ways to disseminate/share results. This will include “advertising” projects and using them as input for other activities.
    • We should aim to define a list of 2 or 3 projects for work in the near term. This should include a plan of action and participants for each.
    • We should have a plan for presentation at SAS 2012. (This might just be an announcement of the plan, to see who wants to participate?)
  • ARR suggests that we might discuss how people will be able to meet the ideas in the article by Jacques et al. that describes guidelines for publication of SAS data from biological macromolecules. There is an accompanying editorial. Are there ideas for modifications to these guidelines? (D. A. Jacques, J. M. Guss, D. I. Svergun and J. Trewhella, 'Publication guidelines for structural modelling of small-angle scattering data from biomolecules in solution', Acta Cryst. (2012), D68, 620-626. doi:10.1107/S0907444912012073)
   PDB  I think these have shown that the current agreement is, as expected, within 10 or 20% in most cases (which is all
        the technique really claims to be good to if you read the old papers). I think the real opportunity now is to see if we
        can go beyond that and figure out how to get agreement regularly at the 5% level. That probably means the community
        will have to understand a lot of the subtler issues that have been shoved under the rug to date, ranging from
        instrument hardware improvements to analysis software improvements -- my 2c worth :-)
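
As a concrete illustration of the Q-calibration item above, here is a minimal sketch (Python) of how a standard with a known d-spacing could be used to check and adjust the q-scale of an instrument. The d-spacing, wavelength, detector distance and ring radius used below are illustrative assumptions, not measured values, and this is not a canSAS-endorsed procedure.

  import numpy as np

  # Bragg relation: a reflection with d-spacing d appears at q = 2*pi/d,
  # and q relates to the scattering angle 2theta via q = 4*pi*sin(theta)/lambda.

  def q_from_geometry(r_mm, sdd_mm, wavelength_A):
      """q (1/Angstrom) at radial distance r_mm on a flat detector
      a distance sdd_mm from the sample."""
      two_theta = np.arctan(r_mm / sdd_mm)
      return 4.0 * np.pi * np.sin(two_theta / 2.0) / wavelength_A

  # Illustrative values only: a layered standard with a known first-order
  # d-spacing (e.g. silver behenate, d ~ 58.4 Angstrom), a nominal wavelength
  # and a nominal sample-to-detector distance.
  d_known = 58.4        # Angstrom (assumed)
  wavelength = 1.0      # Angstrom (assumed)
  sdd_nominal = 2000.0  # mm (assumed)
  r_observed = 35.0     # mm, radius of the observed first-order ring (placeholder)

  q_expected = 2.0 * np.pi / d_known
  q_measured = q_from_geometry(r_observed, sdd_nominal, wavelength)

  # One simple correction: refine the sample-to-detector distance so that the
  # observed ring lands at the expected q.
  theta_expected = np.arcsin(q_expected * wavelength / (4.0 * np.pi))
  sdd_calibrated = r_observed / np.tan(2.0 * theta_expected)

  print(f"q expected : {q_expected:.5f} 1/A")
  print(f"q measured : {q_measured:.5f} 1/A (nominal geometry)")
  print(f"calibrated sample-to-detector distance: {sdd_calibrated:.1f} mm")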

Discussion 28 July 2012

Agenda

The 'Standardisation' sub-group started by having a general discussion with the following agenda items:

  • Suggest what new standardisation is needed
  • Identify how best to organise activities
  • Think about ways to document results
  • What more can we learn? How can one disseminate more from previous activities?

What new work is needed?

It is necessary to find a broader range of materials and samples that can be used as secondary standards, so as to allow measurements with a broad range of instruments and configurations. There are boundaries imposed by count rate, q-range, etc. Users may often prefer to use standards that are related to their own field of science, as this gives a better understanding of any observed anomalies. Samples that are robust (to physical handling, temperature, beam damage, etc.) and that can be made in adequately large quantities are desirable.

Work needs to extend beyond traditional calibration of intensity and momentum transfer/wavelength. For example, it is useful to determine resolution, detector efficiency, etc.
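
As a hedged illustration of what 'resolution' means for the measured data, the sketch below (Python) smears an ideal sphere form factor with a Gaussian q-resolution kernel; the filling-in of the sharp minima is what a resolution standard would help to quantify. The model, the 5% relative resolution and the q-grid are illustrative assumptions, not values from any instrument discussed here.

  import numpy as np

  def sphere_intensity(q, radius):
      """Ideal scattering intensity (arbitrary units) of a monodisperse sphere."""
      x = q * radius
      return (3.0 * (np.sin(x) - x * np.cos(x)) / x**3) ** 2

  def smear_gaussian(q, intensity, dq_over_q):
      """Smear I(q) with a Gaussian resolution kernel of width dq = (dq/q) * q."""
      smeared = np.empty_like(intensity)
      for i, qi in enumerate(q):
          kernel = np.exp(-0.5 * ((q - qi) / (dq_over_q * qi)) ** 2)
          kernel /= kernel.sum()
          smeared[i] = np.sum(kernel * intensity)
      return smeared

  # Illustrative values: a 200 Angstrom sphere and 5% relative q-resolution.
  q = np.linspace(0.005, 0.2, 400)   # 1/Angstrom
  ideal = sphere_intensity(q, radius=200.0)
  smeared = smear_gaussian(q, ideal, dq_over_q=0.05)

  # The deepest form-factor minimum is partially filled in by the smearing.
  print("deepest minimum, ideal  :", ideal.min())
  print("deepest minimum, smeared:", smeared.min())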

Standards for emerging techniques such as grazing incidence scattering are needed. At the moment there is little intercomparison of data in this field and relatively poor modelling of absolute intensity. The field would benefit from stable, 'standard' samples that might give calculable scattering patterns and could be compared at different facilities. It would be helpful to have samples that were appropriate to both GiSAXS and GiSANS.

Discussion and improvement of publication standards is important. There are some challenges in providing standardised deposition of data together with documentation of how the data were reduced. Good practice would have appropriate metadata maintained in processed data files. Procedures for data reduction and data analysis need considerably more documentation than just program names, as input parameters such as transmission, scaling, and calibration procedures/data are required to reproduce the analysis.
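
As one possible illustration of carrying reduction metadata alongside processed data, the sketch below (Python) writes a reduced 1-D data set together with the parameters needed to reproduce it. The file layout, field names and all numerical values are illustrative placeholders chosen for this example; they do not represent a canSAS or facility-defined format.

  import json

  # Placeholder reduced data and reduction parameters -- illustrative only.
  reduced = {
      "title": "example reduced SAS data set",
      "reduction": {
          "software": "example_reduction_program",  # program name used
          "version": "0.0.0",                       # program version used
          "transmission": 0.912,           # fitted sample transmission (placeholder)
          "scaling_factor": 1.07,          # absolute-intensity scale (placeholder)
          "calibration": "description of the intensity standard and procedure used",
          "empty_beam_run": "run identifier",
          "background_run": "run identifier",
      },
      "data": {
          "Q_1/A": [0.01, 0.02, 0.03],     # placeholder values
          "I_1/cm": [105.2, 44.8, 21.3],
          "Idev_1/cm": [1.1, 0.6, 0.4],
      },
  }

  with open("reduced_example.json", "w") as fh:
      json.dump(reduced, fh, indent=2)

  # Anyone re-analysing the data can now see the transmission, scaling and
  # calibration inputs that produced it, rather than just a program name.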

Organisation of Activities

  • Small groups can work efficiently. These may need to be formed 'ad hoc' from people who have a particular interest in a problem or field. The expansion of a particular comparison or study to larger groups can follow as a staged process.
  • The benefit of standardisation is achieved when facilities and instruments can ensure that results are exploited. Facilitating the application of new calibration methods and procedures is important.

Documentation of Results

Documentation is crucial. Sharing the knowledge and understanding of the technique that has been gained is perhaps the most important part of 'standardisation'. Descriptions that appear in published papers will have the most impact.

Exploiting Previous Studies

(a) The round-robin activities with glassy carbon displayed interesting SANS results that should be analysed further and described. The publication on USAXS/SAXS from this material is helpful, but broader application, particularly for SANS, would require interpretation of the data as regards contrast, wavelength dependence, etc.

Some further work on inelastic scattering may be needed on the same samples that have been measured in the round-robin.
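
To indicate the kind of 'interpretation as regards contrast' that SANS on glassy carbon would involve, here is a minimal sketch (Python) of a neutron scattering-length-density estimate for elemental carbon; the mass density used is an assumed, illustrative value, not a measured property of the round-robin samples.

  # Neutron scattering-length density (SLD) of elemental carbon; the SANS
  # signal of a porous two-phase sample scales with the square of the SLD
  # difference between the phases.
  N_A = 6.02214e23       # Avogadro constant, 1/mol
  b_C_fm = 6.646         # bound coherent scattering length of carbon, fm
  M_C = 12.011           # molar mass of carbon, g/mol

  def carbon_sld(density_g_cm3):
      """Neutron SLD of elemental carbon in 1/Angstrom^2."""
      b_cm = b_C_fm * 1.0e-13                           # fm -> cm
      sld_per_cm2 = density_g_cm3 * N_A * b_cm / M_C    # 1/cm^2
      return sld_per_cm2 * 1.0e-16                      # 1/cm^2 -> 1/A^2

  rho_assumed = 1.5              # g/cm^3 (assumed, for illustration only)
  sld_carbon = carbon_sld(rho_assumed)
  sld_pore = 0.0                 # empty pores

  contrast = (sld_carbon - sld_pore) ** 2
  print(f"carbon SLD ~ {sld_carbon:.2e} 1/A^2")
  print(f"contrast (SLD difference squared) ~ {contrast:.2e} 1/A^4")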

(b) Work on interpretation of the results of the polystyrene latex round-robin is under way, and an extended abstract for SAS2012 is available. A fuller paper would be useful.

Round Table 29 July 2012 AM

  • Regarding long-term reproducibility of measurements from standards
    • Here are some fitted transmission values from the ISIS 'TK49' dPS/hPS polymer blend
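
As a hedged sketch of how the long-term reproducibility of such transmission measurements might be summarised, the snippet below (Python) computes the mean and relative spread of a set of repeated transmission values; the numbers in the list are placeholders for illustration, not the fitted ISIS 'TK49' values.

  import statistics

  # Repeated transmission measurements of one standard sample over time
  # (placeholder values for illustration only).
  transmissions = [0.71, 0.72, 0.70, 0.71, 0.72]

  mean_T = statistics.mean(transmissions)
  stdev_T = statistics.stdev(transmissions)
  relative_spread_pct = 100.0 * stdev_T / mean_T

  # A small relative spread over repeated measurements is one simple measure
  # of the long-term reproducibility of the standard and the instrument.
  print(f"mean transmission : {mean_T:.3f}")
  print(f"relative spread   : {relative_spread_pct:.1f} %")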