2012 Standards Discussion
- The following is the agenda of work posted under business for canSAS-2012. Please add comments and expand on details here:
  - Purpose and goals: Intercomparison of data measured on the same sample with different instruments and different techniques (SAXS, SANS, light scattering, etc.) can prove valuable in a number of ways. In particular, it aids understanding of details of the experimental methods and it can help assess reliability. In a similar way, looking at results of data reduction or analysis generated with different software can provide valuable information about performance and verification of methodology (a minimal scripting sketch of such a comparison follows this list). Specifically, these activities should:
    - Provide Quality Assurance/Quality Control,
    - Improve (reduce) uncertainties of SAS measurements in general,
    - Help each facility continuously improve performance and quality of data.
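
The kind of comparison of reduced data described above can be scripted in many ways; the following is a minimal sketch (an illustration, not part of the agenda) that interpolates two reduced I(Q) curves onto their common Q range and reports the intensity ratio. The file names and the three-column ASCII format are assumptions made purely for the example.

 import numpy as np

 def load_iq(path):
     """Load a hypothetical three-column ASCII file: Q (1/A), I(Q), uncertainty."""
     q, intensity, d_intensity = np.loadtxt(path, unpack=True)
     return q, intensity, d_intensity

 def compare_curves(q1, i1, q2, i2, n_points=50):
     """Interpolate both curves (Q ascending) onto the overlapping Q range and return I1/I2."""
     q_lo = max(q1.min(), q2.min())
     q_hi = min(q1.max(), q2.max())
     q_common = np.logspace(np.log10(q_lo), np.log10(q_hi), n_points)
     ratio = np.interp(q_common, q1, i1) / np.interp(q_common, q2, i2)
     return q_common, ratio

 # Hypothetical usage with made-up file names:
 # qa, ia, _ = load_iq("instrumentA_sample1.dat")
 # qb, ib, _ = load_iq("instrumentB_sample1.dat")
 # q, r = compare_curves(qa, ia, qb, ib)
 # print("mean I_A / I_B over the common Q range:", r.mean())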
 
  - We will discuss what types of tests are interesting/important:
    - Beam intensity standards - there are several different ways to quantify this,
    - Standards to test resolution,
    - Absolute intensity calibrations,
    - Materials for Q calibration (see the sketch after this list),
    - etc.
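
As background to the Q-calibration item (an illustration, not part of the agenda): silver behenate is a commonly used calibrant because it gives evenly spaced diffraction peaks at Q_n = 2*pi*n/d with a long spacing d of about 58.38 Angstrom, so measured peak positions can be checked against the nominal ones. The sketch below computes those nominal positions and a simple least-squares scale factor; the function names are illustrative only.

 import numpy as np

 D_AGBEH = 58.38  # silver behenate long spacing in Angstrom (commonly quoted value)

 def nominal_peak_positions(n_orders=3, d=D_AGBEH):
     """Nominal peak positions Q_n = 2*pi*n/d, in 1/Angstrom, for the first n_orders orders."""
     orders = np.arange(1, n_orders + 1)
     return 2.0 * np.pi * orders / d

 def q_scale_factor(measured_peaks, d=D_AGBEH):
     """Least-squares scale s such that nominal ~ s * measured, order by order."""
     measured = np.asarray(measured_peaks, dtype=float)
     nominal = nominal_peak_positions(len(measured), d)
     return float(np.sum(nominal * measured) / np.sum(measured ** 2))

 print(nominal_peak_positions(3))              # ~ [0.1076, 0.2153, 0.3229] 1/A
 print(q_scale_factor([0.108, 0.216, 0.323]))  # close to 1.0 if the Q axis is already correct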
 
  - Some other related issues:
    - Inelastic scattering,
    - Multiple scattering,
    - Wavelength contamination,
    - Detector efficiencies at different wavelengths,
    - Limits in signal to noise - how weak a signal can be reliably extracted (see the sketch after this list),
    - etc.
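
On the signal-to-noise item, a rough counting-statistics estimate is often the starting point. The sketch below assumes pure Poisson statistics (an assumption for illustration, not a statement of any facility's actual procedure): a background-subtracted signal N_sample - N_background carries an uncertainty of roughly sqrt(N_sample + N_background).

 import math

 def signal_to_noise(n_sample, n_background):
     """S/N of the background-subtracted count (n_sample - n_background), assuming Poisson counting."""
     signal = n_sample - n_background
     sigma = math.sqrt(n_sample + n_background)
     return signal / sigma

 # Example: 1100 counts with the sample in the beam versus 1000 counts of background alone
 # gives only about a 2.2-sigma excess, i.e. close to the limit of reliable extraction.
 print(signal_to_noise(1100, 1000))  # ~ 2.18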
 
  - Outcomes needed are:
    - A written plan to sustain long-term effort in this area,
    - This should describe how to seed, co-ordinate and publicise “ad-hoc” projects,
    - Assess how frequently exercises can be undertaken,
    - Define good ways to disseminate/share results. This will include “advertising” projects and using them as input for other activities.
 
  - We should aim to define a list of 2 or 3 projects for work in the near term. This should include a plan of action and participants for each.
  - We should have a plan for presentation at SAS 2012. (This might just be an announcement of the plan, to see who wants to participate.)
 