Spur AQI

The Colorado State Spur campus air quality project was a unique and rewarding project I was hired to build for the Geospatial Centroid, in partnership with CSU's Chemistry department and the state's clean air initiative.
For this project I led the creation of custom, always-on hardware for both the front and back end of Spur's air quality sensor array, as well as the database system and software that serve the data 24/7 for climate research across the statewide CSU system. The spatially enabled SQL database is integrated into the on-site ArcGIS Enterprise system in the Morgan Library and features RAID backups, so no climate data is ever lost.

The process of building the system started with me pitching standard 4U ATX PC chassis from SilverStone. I had experience building with them in the past, putting together audio workstations for my magnet high school, and remembered their chassis being incredibly robust and feature rich. Essentially a tower desktop set on its side, with quick-access front I/O and server rack mounting hardware, the chassis let us use standard desktop components (including coolers). I had to build four machines in total: two robust, redundant servers and two 'instrument' machines with substantial legacy support. Each instrument machine needed at least 12 standard RS232/parallel ports so the sensor equipment's legacy interfaces could talk to it; the instrument machines would in turn talk to the server machines, which held the databases and were connected to the web. On top of that sat a load balancer running NGINX to handle proxying and traffic management, and to automatically switch between servers if one went down.
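The failover behavior described above maps onto NGINX's upstream mechanism. A minimal sketch, assuming an active/backup arrangement between the two data servers (hostnames, ports, and timeouts here are placeholders, not the actual Spur deployment values):

```nginx
# Active/backup failover between the two database servers.
# All names and numbers are illustrative.
upstream aq_servers {
    server server-a.example.edu:8080 max_fails=3 fail_timeout=30s;
    server server-b.example.edu:8080 backup;  # used only if the primary is marked down
}

server {
    listen 80;

    location / {
        proxy_pass http://aq_servers;
        # Retry the other server on connection errors or gateway failures
        proxy_next_upstream error timeout http_502 http_503;
        proxy_connect_timeout 5s;
    }
}
```

With `backup` set on the second server, NGINX only routes traffic there after the primary fails its health checks, which is the automatic switchover behavior described above.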
The server machines were a lot simpler but much more powerful: the same chassis and motherboards as the instrument machines, paired with a more powerful Intel Core i7-12700 retail CPU, whose onboard graphics allowed on-site diagnostics and setup. These machines would live at CSU Chemistry hooked up to a UPS, so no redundant power supplies were needed, just standard, quality Seasonic Focus GX 80+ Gold units. They used the same RAM and SSD models as the instrument machines, just in smaller capacities, since they wouldn't be backing up the raw data stream. These PCs also came with 10GbE NICs standard, so networking was covered for our use case. Our project partners approved the parts, and within weeks we were building.
For the instrument machines, since they'd be running 24/7 for years at a time, I started with a solid, proven platform: the Gigabyte Aorus Pro Z790 with a DDR4 memory bus rather than the newer, then less proven DDR5. Memory speed wouldn't matter for this application anyway; it was far more important to prioritize stability. We went with a high-end consumer desktop board instead of a specialty server board from Supermicro or a similar maker purely because of budget constraints, and because these high-end Gigabyte boards offer great I/O, support, and compatibility. For the CPU I chose an Intel Core i5-12400. These machines' main purpose would be to consistently read data off the measurement instruments, which isn't very intensive, and since we weren't sure what operating system the older, niche scientific instruments would require, we wanted a CPU without specialty efficiency cores that would need manual wrangling. The 12400 is one of Intel's last purely P-core CPUs, so it would perform well no matter the situation. For power supplies we found unique units that I'm still amazed exist to this day: FSP's "Twins," essentially two hot-swappable 1U server power supplies in an ATX-sized container. They automatically switch over if one fails, improving reliability and redundancy. I can't believe these units exist; you could put them in any old computer and get dual-PSU functionality. They're insanely expensive per watt, at roughly a 1:1 dollar-to-watt ratio, but the project partners saw my pitch for them and agreed they'd be a great addition to these long-running machines. A high-end Noctua cooler with one of their long-life fans, 32 GB of ECC DDR4-3200, and 4TB Micron server SSDs in RAID 1 to back up the raw data streams from the instruments finished off the main platform.
We then used four StarTech RS232 expansion cards, confirmed compatible with up-to-date Linux kernels and with Windows, one in each expansion bay, plus a breakout to utilize the motherboard's built-in COM port, to get us up to 12 ports.
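The instrument-side readout over those serial ports can be sketched roughly as follows. This is a hypothetical example, assuming the sensors emit simple comma-separated ASCII records; the real instruments each had their own protocols, and the sensor IDs, field layout, and device path shown here are invented for illustration. In production the raw bytes would come off a port such as `/dev/ttyS4` via a serial library; here we just parse a canned sample line:

```python
# Hypothetical parser for one instrument record, assuming a
# "SENSOR_ID,PM2.5,PM10" ASCII format (not the actual Spur protocol).
from datetime import datetime, timezone

def parse_reading(raw: bytes) -> dict:
    """Turn one raw serial line into a timestamped measurement dict."""
    sensor_id, pm25, pm10 = raw.decode("ascii").strip().split(",")
    return {
        "sensor": sensor_id,
        "pm2_5": float(pm25),   # micrograms per cubic meter (assumed units)
        "pm10": float(pm10),
        "read_at": datetime.now(timezone.utc).isoformat(),
    }

# In production this line would be read from the RS232 port, e.g. with
# pyserial: serial.Serial("/dev/ttyS4", 9600, timeout=1).readline()
sample = b"AQ-EAST-01,12.4,30.1\r\n"
reading = parse_reading(sample)
print(reading["sensor"], reading["pm2_5"])
```

Tagging each record with a UTC timestamp at read time is what lets the server-side database treat the instrument machines as a uniform stream, regardless of which legacy interface the data arrived on.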


