I am trying to display a spectrum together with its upper and lower deviation. For that I have a class, SpectrumClass, where the coordinates are stored in a DataTable (dtCoords).
The spectrum I am showing is the average of several other spectra. From those spectra I calculate the standard deviation as follows (all spectra have the same number of data points):
DataTable dt = new DataTable();
DataColumn columnX = new DataColumn("X");
DataColumn columnY = new DataColumn("Y");
dt.Columns.Add(columnX);
dt.Columns.Add(columnY);
SpectrumClass stdSpectrum = new SpectrumClass(0, "Standard deviation", dt);

// Iterate through each intensity value
for (int i = 0; i < specs[0].dtCoords.Rows.Count; i++)
{
    double rShift = 0;
    double IntensitySum = 0;
    // Sum the intensities of all spectra at this point
    foreach (SpectrumClass spec in specs)
    {
        IntensitySum += Convert.ToDouble(spec.dtCoords.Rows[i][1], System.Globalization.CultureInfo.InvariantCulture);
        // The X value is the same for all spectra at index i
        rShift = Convert.ToDouble(spec.dtCoords.Rows[i][0], System.Globalization.CultureInfo.InvariantCulture);
    }
    // Average value
    IntensitySum /= specs.Count;

    // Variance (here IntensitySum holds the average value)
    double variance = 0;
    foreach (SpectrumClass spec in specs)
    {
        double y = Convert.ToDouble(spec.dtCoords.Rows[i][1], System.Globalization.CultureInfo.InvariantCulture);
        variance += (y - IntensitySum) * (y - IntensitySum);
    }

    // Sample standard deviation for this point
    double stdValue = Math.Sqrt(variance / (specs.Count - 1));
    stdSpectrum.dtCoords.Rows.Add(rShift.ToString(System.Globalization.CultureInfo.InvariantCulture), stdValue.ToString(System.Globalization.CultureInfo.InvariantCulture));
}
return stdSpectrum;
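For cross-checking, the same per-point sample standard deviation can be written more compactly with LINQ. This is only a sketch, assuming specs is a List&lt;SpectrumClass&gt; and that column 0 holds X and column 1 holds Y in dtCoords:

using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;

// Sample standard deviation (n - 1 in the denominator) of the
// Y values of all spectra at row index i.
static double StdAtRow(List<SpectrumClass> specs, int i)
{
    // Collect the i-th intensity of every spectrum
    double[] values = specs
        .Select(s => Convert.ToDouble(s.dtCoords.Rows[i][1], CultureInfo.InvariantCulture))
        .ToArray();
    double mean = values.Average();
    // Sum of squared deviations, divided by (n - 1)
    double variance = values.Sum(v => (v - mean) * (v - mean)) / (values.Length - 1);
    return Math.Sqrt(variance);
}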
Later, in the visual part, I create two new series that contain the upper and lower deviation. Here I simply add or subtract the Y value for the corresponding point of the average spectrum.
for (int i = dt.Rows.Count; i > 0; i--)
{
    double x = double.Parse(stdSpec.dtCoords.Rows[i - 1][0].ToString(), System.Globalization.CultureInfo.InvariantCulture);
    double avgY = double.Parse(dt.Rows[i - 1][1].ToString(), System.Globalization.CultureInfo.InvariantCulture);
    double stdY = double.Parse(stdSpec.dtCoords.Rows[i - 1][1].ToString(), System.Globalization.CultureInfo.InvariantCulture);

    // Upper and lower STD bounds
    if (expMeas.Count > 1)
    {
        ExpChart.Series["STD+"].Points.AddXY(x, avgY + stdY);
        ExpChart.Series["STD-"].Points.AddXY(x, avgY - stdY);
    }
    ExpChart.Series["Spectrum"].Points.AddXY(x, avgY);
}
And here is my problem:
I think the standard deviation spectra (in gray) should look the same, shouldn't they? I randomly checked a few standard deviation points and they seem to be right. Could there be a mistake in one of my loops? I hope somebody has experienced the same, because checking over a thousand points by hand isn't fun.
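Rather than spot-checking by hand, something like the following could verify every point at once. Again only a sketch, assuming all three series have already been filled, contain the same number of points, and are in the same order:

// Check that STD+ and STD- are symmetric around the average spectrum
// at every index; any asymmetry would point to a bug in one of the loops.
var avg = ExpChart.Series["Spectrum"].Points;
var upper = ExpChart.Series["STD+"].Points;
var lower = ExpChart.Series["STD-"].Points;
for (int i = 0; i < avg.Count; i++)
{
    double up = upper[i].YValues[0] - avg[i].YValues[0];
    double down = avg[i].YValues[0] - lower[i].YValues[0];
    if (Math.Abs(up - down) > 1e-9 || upper[i].XValue != avg[i].XValue)
        Console.WriteLine("Mismatch at index " + i + ": +" + up + " vs -" + down);
}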
Thanks in advance.