@Jesse If you look at the chart on the dashboard for my "Iowa State (Interlock House)" location, you can see that a lot of data is missing. I just finished reviewing the CSV data produced by LoggerNet, and there were only three missed 20-second records during the period when the new datalogger program was being compiled. Therefore, I think something is still not working right on your system.
A clarification: In your email, you said that the datalogger (including the record number) is reset when a new program is uploaded. This is only true if the program results in a structural change to a datalogger table. In this particular case, my program did not make any structural changes to any datalogger tables, so the tables were not reset and the record number did not reset to zero. There was a small timestamp gap (three 20-second records), but the record number sequence continued where it left off before the program update. This behavior is as expected, according to Campbell's documentation and my own experience.
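To illustrate the distinction, here is a minimal sketch of the check I ran against the LoggerNet CSV. The sample rows below are hypothetical (the real file follows Campbell's TOA5 layout with TIMESTAMP and RECORD columns); the point is that a timestamp gap with an unbroken record-number sequence indicates dropped records, not a table reset:

```python
from datetime import datetime, timedelta

# Hypothetical sample rows in (TIMESTAMP, RECORD) order. Note the
# timestamp gap (three missing 20-second records) while the RECORD
# sequence continues uninterrupted -- i.e. no table reset occurred.
rows = [
    ("2013-06-01 10:00:00", 1041),
    ("2013-06-01 10:00:20", 1042),
    ("2013-06-01 10:01:40", 1043),  # 80 s after the previous row
    ("2013-06-01 10:02:00", 1044),
]

INTERVAL = timedelta(seconds=20)

def check(rows):
    """Report timestamp gaps and record-number discontinuities."""
    gaps, discontinuities = [], []
    prev_ts, prev_rec = None, None
    for ts_str, rec in rows:
        ts = datetime.strptime(ts_str, "%Y-%m-%d %H:%M:%S")
        if prev_ts is not None:
            missing = int((ts - prev_ts) / INTERVAL) - 1
            if missing > 0:
                gaps.append((prev_ts, ts, missing))
            if rec != prev_rec + 1:
                discontinuities.append((prev_rec, rec))
        prev_ts, prev_rec = ts, rec
    return gaps, discontinuities

gaps, discontinuities = check(rows)
print(gaps)             # one gap of 3 missing 20-second records
print(discontinuities)  # empty: record numbers continued unbroken
```

Run against the real CSV, this reports exactly the three-record gap and no record-number reset, which is what I described above.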
Finally, a comment and request: When I was originally considering whether to go with eagle.io, one of my biggest concerns was the lack of QA tools for inspecting the data set (this isn't unusual, BTW...none of the tools I considered had good QA features). For example, there is no way to use the built-in features to detect missing data unless you build a chart and get lucky enough to notice the gap, as I did in this case. To address this shortcoming, I am planning to build my own tool that grabs the data using the API, inspects the data set for problems, and reports on the quality of the data set (number and location of missing/duplicate data, etc.). Currently, I use a tedious spreadsheet approach to inspect the data.
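For anyone curious, the core of the tool I have in mind is roughly the following sketch. I've left out the HTTP fetch against the eagle.io API (the exact endpoint and JSON shape are assumptions I haven't pinned down yet) and just shown the QA pass over already-retrieved (timestamp, value) pairs:

```python
from collections import Counter
from datetime import datetime, timedelta

EXPECTED = timedelta(seconds=20)  # my logging interval

def qa_report(records):
    """Given (timestamp, value) pairs sorted by time, report duplicate
    timestamps plus the location and size of missing-record gaps.
    Fetching `records` from the eagle.io API is assumed to happen
    upstream of this function."""
    times = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t, _ in records]
    dupes = sorted(t for t, n in Counter(times).items() if n > 1)
    gaps = []
    for a, b in zip(times, times[1:]):
        missing = int((b - a) / EXPECTED) - 1
        if missing > 0:
            gaps.append({"after": a, "before": b, "missing": missing})
    return {
        "n_records": len(records),
        "duplicates": dupes,
        "gaps": gaps,
        "total_missing": sum(g["missing"] for g in gaps),
    }

# Hypothetical data with one duplicate timestamp and one gap.
rows = [
    ("2013-06-01 10:00:00", 21.4),
    ("2013-06-01 10:00:20", 21.5),
    ("2013-06-01 10:00:20", 21.5),  # duplicate
    ("2013-06-01 10:00:40", 21.6),
    ("2013-06-01 10:02:00", 21.9),  # 3 records missing before this one
]

report = qa_report(rows)
```

This is exactly the kind of summary (counts and locations of missing/duplicate records) that I'd love to see built into the product itself.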
If the eagle.io system were a bit more mature and "battle tested," I would be more inclined to trust that it is bug-free and I might not bother doing QA myself. However, as this and the other bugs I have discovered illustrate, the system is not fully mature. I'm also a little concerned that I seem to be the only user posting to the forum...am I "stressing" the system much more than your other customers, or do they choose not to report bugs? I'm really curious about your other customers' use cases. Anyhow, I've decided to take a "trust but verify" approach, which is always a good idea, in my opinion. My request is that you consider adding some built-in QA tools so that the QA burden is not placed on your customers.
BTW, I think your tool is fantastic, which is why I chose it over the other available options in the marketplace. I have no regrets!