I have some data whose exact structure I know. It has to be written to files second by second. The structs contain double fields, but the fields have different names. The same number of structs has to be written to the file every second.
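
For reference, a minimal sketch of what one of these structs might look like (the names here are hypothetical; only the fixed layout of doubles matters), with sequential layout so it can be marshalled byte-for-byte:

    using System.Runtime.InteropServices;

    [StructLayout(LayoutKind.Sequential)]
    public struct Sample          // hypothetical name
    {
        public double Pressure;   // the real structs have different
        public double Velocity;   // field names, but all fields are doubles
        public double Altitude;
    }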

The thing is: which is the better approach when it comes to reading the data back?

1- Convert the structs to bytes, then write them while indexing the byte that marks the end of each second (a write-side sketch follows this list)

2- Write CSV data and index the byte that marks the end of each second
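
For option 1, the write side could look roughly like this sketch (Sample, sample, _filestream, _index and currentSecond are hypothetical names; _index maps each second to the byte offset where it ends):

    // Marshal one struct into a byte buffer and append it to the file.
    int size = Marshal.SizeOf(typeof(Sample));
    byte[] buffer = new byte[size];
    GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
    Marshal.StructureToPtr(sample, handle.AddrOfPinnedObject(), false);
    handle.Free();
    _filestream.Write(buffer, 0, buffer.Length);

    // After the last struct of the current second, record the offset
    // that marks the end of that second.
    _index[currentSecond] = _filestream.Position;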

The data is requested on a random basis from the file, so in both cases I will set the Position of the FileStream to the byte offset of the requested second.
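
Concretely, the seek would be something like this (_index is the same hypothetical second-to-offset map; second 0 starts at offset 0):

    // The requested second spans from the end of the previous second
    // to its own recorded end offset.
    long start = requestedSecond == 0 ? 0 : _index[requestedSecond - 1];
    long end = _index[requestedSecond];
    _filestream.Seek(start, SeekOrigin.Begin);
    byte[] buffer = new byte[(int)(end - start)];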

In the first case, I will use the following for each struct in that second to get the whole data:

    _filestream.Read(buffer, 0, buffer.Length);

    GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
    oReturn = Marshal.PtrToStructure(handle.AddrOfPinnedObject(), _oType);
    handle.Free();   // release the pinned handle, or the buffer stays pinned

The previous approach is applied X times, because there are around 100 structs every second.
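
Spelled out, that per-second read loop would be something like this (structCount, _structSize and samples are hypothetical names):

    // Repeat the read-and-marshal step once per struct in the second.
    byte[] buffer = new byte[_structSize];
    object[] samples = new object[structCount];   // structCount is ~100
    for (int i = 0; i < structCount; i++)
    {
        _filestream.Read(buffer, 0, buffer.Length);
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        samples[i] = Marshal.PtrToStructure(handle.AddrOfPinnedObject(), _oType);
        handle.Free();
    }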


In the second case, I will use string.Split(',') and then fill in the data accordingly, since I know the exact order of my data:

    file.Read(buffer, 0, buffer.Length);

    string val = System.Text.Encoding.ASCII.GetString(buffer);

    string[] row = val.Split(',');
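
Filling the struct back in from the split row would then rely on the known column order, e.g. (Sample and its fields are the same hypothetical names as above):

    // Parse each column as a double, in the known fixed order.
    Sample s = new Sample();
    s.Pressure = double.Parse(row[0], System.Globalization.CultureInfo.InvariantCulture);
    s.Velocity = double.Parse(row[1], System.Globalization.CultureInfo.InvariantCulture);
    s.Altitude = double.Parse(row[2], System.Globalization.CultureInfo.InvariantCulture);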

Edit: the profiler is not showing a difference, but I cannot simulate the exact real-life scenario because the file size might get really huge. I am looking for theoretical information for now.
