The Brilliant Boy Billionaire

The Amazing Journey of a Remarkable Kid, by Altimexis

Posted June 30, 2021

PART THREE — The Fun House

Chapter 3: Bring on the Robots

“J.J., this is Clarence,” Mr. Winters said in introduction, “and Clarence, this is J.J.” It was Tuesday morning, and after spending all of Monday going through mandatory employee orientation that could’ve and should’ve been done in the comfort of my home, I was finally being introduced to the guy who would show me the ropes. When I shook Clarence’s hand, however, he had the limpest handshake of any I’d ever experienced. My Dad would’ve crushed the poor guy’s hand if they’d ever met.

“Hello, J.J., it’s nice to have you with us,” Clarence responded in a voice so quiet I could barely hear it. I’d read stories involving effeminate boys on the internet, and I was familiar with Hollywood’s past portrayals of gay characters, but I’d never encountered anyone so effeminate in my life. Talk about stereotypes!

“J.J. has his data center tech certification,” Mr. Winters continued. “However, this is his first job. He’s only sixteen, but in addition to being a certified data-center tech, he has certifications in web development.” Clarence seemed shocked. “He’s a very fast learner, and we do plan to use his skills in web development, too, once he’s up to speed. You’re still our main man, Clarence. I think J.J.’s headed for a position beyond Omaha.” Clarence visibly relaxed at that. He musta actually thought I was after his job. That was a laugh. However, I realized I was probably already making more than he was, so I’d be careful not to discuss salary with him.

“So, I’m going to head back now,” Mr. Winters continued. “Feel free to contact me if either of you have questions or issues. J.J., I’ll meet with you periodically, and we’ll talk about taking on some website-development tasks.”

After Mr. Winters had left, Clarence turned to me and asked, “You’re really only sixteen?”

“As of January,” I replied. “It probably doesn’t help that I look more like I’m fourteen.”

“Believe me, I know what that’s like,” Clarence responded. “I’m twenty-seven, but everyone thinks I’m more like seventeen.” Truthfully, I really had thought he was closer to my age.

“It would help if my damn voice would change and if I’d get some hair other than on my head,” I replied with a laugh. “You can imagine what it was like in high school, especially since I started my freshman year when I was only eleven.”

“Good god, you really are a genius,” he responded. “High school must’ve been awful for you.”

“I spent as much time as I could in the library,” I replied, “which is one of the reasons I’m so far ahead. I had good reason to spend all my time in the library, even after school let out every day. Home wasn’t too good, either.”

“I grew up in a Mormon household,” Clarence responded. “You can imagine what life was like for me. I had to get married — to a girl of course. It was all arranged for me. I’d like to say I’m one of those straight effeminate guys you hear are supposed to exist, but I’m not. I hope it doesn’t freak you out to work with a gay guy…”

“Hardly,” I replied. “I’m gay. It’s the reason I had to leave home when I was fifteen. It’s the reason I had to live on the street, but I got lucky compared to most. I got a job painting houses with a family that gave me a roof over my head, and that ultimately led me here.”

“I can’t believe you’re gay,” Clarence responded, “and you’re out?”

“No reason to hide it,” I replied. “I’ll never find a boyfriend, hiding in the closet.”

“There’s no closet deep enough for guys like me,” Clarence continued, “and Nebraska isn’t exactly gay-friendly, especially in the small town where I grew up.”

“Probably not much better than where I grew up in Southern Indiana,” I pointed out.

“I don’t have to imagine,” Clarence replied. “I lived it. Unfortunately, that left me married and with two kids. I’ve never done anything with a guy, and I probably never will have the opportunity now. I do love my wife, even though our relationship is essentially platonic, and I wouldn’t want to imagine life without my kids. I’m not unhappy — just frustrated.”

“I’m sure your wife must be frustrated, too,” I suggested. “Have you ever considered having an open relationship where you’d each be free to pursue other relationships on the side?”

“We were raised Mormon, and although we’re no longer active participants in the Church, we could never do anything outside of marriage,” Clarence explained. “Besides which, I wouldn’t want to take a chance on hurting the kids. Maybe after they’re grown and in college, but certainly not before then.”

Then turning toward me, he suddenly got a look of horror on his face and asked, “You weren’t implying that we should do something together, were you?”

“Clarence, you’re eleven years older than I am and you’re my boss, I think,” I replied. “It wouldn’t be appropriate for us to do anything together. I’m way too young, anyway.”

“Sixteen is the age of consent in Nebraska,” Clarence pointed out.

“No offense, Clarence, but you’re not my type,” I replied. “Some guys are into effeminate guys or even guys in drag, but that’s just not me.”

“I totally understand,” Clarence responded. “I wouldn’t date someone like me, either. Let’s get down to looking at how the data center’s organized.”

Not surprisingly, Applazon’s servers were based on Linux, an open-source, Unix-like operating system known for its stability; UNIX itself had been developed at AT&T’s Bell Labs in the late 1960s. In the early days of the internet, most servers ran on UNIX because everything else tended to crash. That began to change in 1995, when Microsoft introduced Windows 95, and in 2000, when Applazon introduced ApplazonOS, which was also built on a UNIX foundation. A lot of low-volume, small-business enterprises that used Applazon personal computers adopted ApplazonOS to provide a stable, secure, seamless interface to the web, but that was a niche market. It was ironic that Applazon eschewed its own server operating system in favor of Linux for its data centers, but then ApplazonOS wasn’t designed to run on vast networks of computers.

Most enterprises ran their servers on Windows because they already relied on Microsoft’s operating system to power their desktop computers and workstations. Windows offered a polished product that readily adapted to site-specific modifications. However, it was full of vulnerabilities that required software patches once hackers discovered them. On top of that, Microsoft was always tinkering with Windows, and with each new release came new vulnerabilities. Windows was proprietary and had to be licensed from Microsoft at significant expense. I never really understood why any web enterprise chose Windows, yet a majority did. The only real advantage to information-technology professionals was job security, which probably explained Windows’ popularity.

Corporate executives who couldn’t be bothered with the details of website management usually left such decisions to their I.T. directors, who often had their own agendas. Jeff Barlow knew better than to trust anyone to apply technology that affected the bottom line unless they also had a substantial stake in it. Andy Jenkins, the current head of Applazon Cloud Resources, was one of Applazon’s first hires, and he remained one of the corporation’s major stockholders. Needless to say, he was quite wealthy in his own right.

Applazon required a nimble, secure, stable and inexpensive operating system to power its servers, so naturally they’d chosen Linux, which was open source. Open source meant that the source code was available at no cost and could be modified as needed, provided that the modifications were also made freely available to the public. The Linux kernel by itself lacked the tools for server management and had no built-in web server, so Applazon built on Debian, a Linux distribution that dated back to the mid-90s, was open source and offered unparalleled stability and security. However, rather than relying on an open-source web server such as Apache, Applazon developed its own unique front end that was better suited to the massive data management required for ACR. Whereas Linux and Debian were open source, the ACR software definitely was not. I was intimately familiar with nearly every server variant out there, thanks to years of studying them while in school, but learning the ins and outs of ACR was going to be totally new to me.

Likewise, Applazon’s hardware was completely custom-made, avoiding the vagaries of hardware commodity markets and the component shortages that plagued other installations. Working out of Cupertino, California, in the heart of Silicon Valley, Applazon’s computer scientists and solid-state physicists designed their own switches and interconnects, built around a wonky 25 Gb/s-per-lane Ethernet standard whose lanes could be aggregated for significantly higher throughput over optical fiber than the gigabit, 10 Gb/s and even 40 Gb/s wired systems used in the consumer products they designed. Even the SSD controllers were custom-designed.

Although Applazon designed all their hardware in house, they contracted with several manufacturers so as not to be dependent on any one company or supply chain. Applazon’s servers were far more fault tolerant than most, but then Applazon had a network the likes of which no one else could ever hope to match. With data centers located all over the world, data were multiply redundant, both within facilities and between facilities.

The placement of one of the world’s largest data centers in Omaha was no accident. Whereas the West Coast was constantly at risk of fires and earthquakes, and whereas the East Coast and Gulf Coast were constantly under the threat of hurricanes, our location was among the safest in the continental U.S. Flooding from the slow-moving and shallow Platte River and from the mighty Missouri was always a risk, as we’d just witnessed, but the permanent data center would be built on high ground, elevated well above any potential flooding. Tornadoes and severe storms were also a possibility, but they posed more of a risk to the wind farm than to the data center itself, which would be in a hardened bunker. In any case, we had emergency generators that could supply power for weeks on end. Nothing short of global thermonuclear war or a giant asteroid hitting the earth could take us out.

Not only were we in the center of the continent, but we were literally on the edge of civilization. Just as the East Coast and West Coast servers connected to data centers in Europe, Africa and Asia, Omaha was on the edge of a vast region of North America in which there were no major cities for hundreds of miles. If one were to draw a series of lines connecting Omaha, Kansas City, Wichita, Tulsa, Oklahoma City, Fort Worth, Austin, San Antonio, El Paso, Albuquerque, Colorado Springs, Denver, Salt Lake City, Boise, Spokane, Calgary, Saskatoon, Regina, Winnipeg and Minneapolis, the enclosed area was virtually devoid of population centers of more than a quarter-million people. Not that there weren’t people living there, but from a data standpoint, it was a vast inland sea, and we were on the coast, providing a connection from points east to points west.

Clarence spent virtually the entire morning walking me through the server architecture and the monitoring software used to keep track of data flow and server performance. From any console, we could pull up individual website statistics, server performance, diagnostics and the estimated percentage of remaining component lifespan, among other information. The interface was entirely web-based, which made it available from any workstation. It also meant that I had the background knowledge to delve into the code and potentially to modify it. I expected that could come in handy someday.

The main console also showed a list of maintenance tasks that were needed and a schedule for performing them. Every day, the system generated a list of servers to be replaced, and we were responsible for taking each one out of service, removing it and replacing it with a new or refurbished server, uploading data onto the new server’s SSD and placing it back in service. We were also responsible for refurbishing each server, repairing or replacing components that were at end of life or that were not performing up to spec, and troubleshooting problems that internal diagnostics couldn’t resolve. Once the repairs were completed, the server would be placed in a queue of units available to go back into service. All of this was extremely labor-intensive, which explained why we needed such a large staff of well-paid, highly skilled technicians.

Right away, the rational part of my brain wondered why the whole process wasn’t more automated. We had robots back in the delivery station that retrieved thousands of items of various sizes and shapes every day, sorting and preparing them for delivery. Why couldn’t we have an automated system for replacing servers? If the racks were cylindrical instead of square, with the servers arranged vertically instead of horizontally and with all the interconnects within a backplane, a single robotic arm could access every server, removing and replacing it without human intervention. Technicians would always be needed for troubleshooting and for servicing the hardware, but automation could increase capacity without the need to hire more techs. Was there a reason Applazon hadn’t done this already? Was it a matter of cost, or was it simply that they’d grown so rapidly that there wasn’t time to design something that didn’t have an immediate return?

Clarence seemed to be astounded by the speed at which I learned how to use the software, but to me it was just the same as learning anything else. I only needed to be shown something once, and I often figured things out on my own. Unless software was designed poorly, it usually presented a logical interface to the user. Applazon’s software was among the most intuitive I’d ever encountered, which was not to say there wasn’t room for improvement. Already I was developing ideas for better ways to automate tasks and to present the data more concisely.

When the noon hour arrived, I discovered that lunch was definitely not available onsite. There were vending machines with a variety of snack items and a handful of sandwiches that made my old high school’s cafeteria food look gourmet. There were no restaurants, food carts or takeout spots anywhere nearby, and certainly none within walking distance. Checking my phone, I found a Walmart Supercenter across the Interstate, literally opposite where Applazon was building the new data center, but there was no way I could walk there in time. It seemed the only place that delivered was Arby’s, but they had a minimum charge. Clarence had brought his lunch, as had everyone else, but no one objected when I offered to send out for Arby’s, my treat. From now on I’d have to brown-bag it.

After lunch, Clarence showed me the specifics of how to remove and replace servers within the racks. I was shocked by how inefficient the procedure really was, as we had to shut down the server from the console in the front of the room, physically unbolt it and remove it from the rack, disconnect several power cables, a fiberoptic Ethernet cable and a series of miscellaneous interconnect cables, connect those same cables to the new server and then slide it in place and bolt it into the rack. We then had to go back to the console in front and run a full diagnostic on the new server to confirm that it met specs before uploading data to it and then bringing it online. Replacing the hardware only took about fifteen minutes if no problems were encountered, but actually bringing the server online took over an hour.

Applazon made use of custom-designed ARM processors and interface chipsets but relied on standard off-the-shelf RAM and NVMe SSD modules. The architecture was standardized across Applazon sites and could not be modified. The servers themselves were modular, making it easier to replace them as integral units that could be maintained, troubleshot or refurbished offline. Thanks to aggressive data management, Applazon had managed to increase SSD reliability and longevity by a factor of at least three over those used in consumer devices. I discovered that just by looking at the statistics recorded from this particular data center. However, a small percentage of new SSDs failed early, and in a facility with 12,000 servers, early SSD failure accounted for a significant percentage of server replacements. The average time to replacement for a server was only two years, but that was primarily because Applazon had a policy of replacing all servers after three years of service, regardless of failure, due to technological obsolescence of the processors and other components. That worked out to replacing about sixteen servers a day.

Of course, the time required to replace each server was only part of the equation. The data-center technicians were also responsible for refurbishing the servers before they could be placed back in service. About half were outright scrapped and replaced, but even the scrapped units had to be decommissioned, which meant removing all components that could still be used in refurbishing other servers, performing a military-grade data wipe of SSDs that were at end of life, and sending what was left to another facility for recycling. The physical labor required to scrap a server was actually about the same as that required to refurbish one: roughly a half hour. However, about ten percent of the servers removed had problems that couldn’t be resolved with the built-in diagnostic software; we were allowed to spend up to two hours troubleshooting and repairing those before the unit was scrapped. All told, that added up to about twelve hours of labor a day, or 2¼ full-time positions after accounting for weekends and paid time off. Multiply that by a factor of six once the new data center, currently under construction, finally opened, and I calculated roughly a million dollars per year just for the cost of the physical labor to replace servers.
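
The arithmetic was simple enough to jot down as a few lines of Python, something like this, with the fleet size, average lifetime and daily labor hours coming from what I’d seen on the console, and the productive hours per tech and the loaded cost per position being nothing more than my own guesses:

```python
# Back-of-envelope math for the server-replacement workload.
# Fleet size, average lifetime and daily labor hours come from the console;
# productive hours per tech and loaded cost per position are assumptions.
fleet_size = 12_000                # servers in the current facility
avg_life_years = 2                 # average time to replacement
servers_per_day = fleet_size / (avg_life_years * 365)

labor_hours_per_day = 12           # hands-on hours: swaps, refurbs, scrapping, troubleshooting
productive_hours_per_tech = 1_950  # per year, after weekends and paid time off (assumed)
loaded_cost_per_tech = 72_000      # salary plus overhead per position (assumed)

fte_now = labor_hours_per_day * 365 / productive_hours_per_tech
fte_expanded = fte_now * 6         # the new data center would be roughly six times the size

print(f"{servers_per_day:.1f} servers replaced per day")
print(f"{fte_now:.2f} full-time techs now, {fte_expanded:.1f} after the expansion")
print(f"Roughly ${fte_expanded * loaded_cost_per_tech:,.0f} per year in replacement labor")
```

However I shuffled my guesses, the conclusion held: once the new facility opened, manually replacing servers would cost on the order of a million dollars a year.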

As we approached the end of the day, I asked, “Clarence, do you know when the new data center is supposed to be ready?”

“They’ve just started clearing the land,” he replied. “It’s going to be a state-of-the-art building with all concrete construction to protect against the weather extremes that are expected to become increasingly common with climate change. I doubt it’ll open any time before 2021.”

“Do you know if the hardware layout is already, no pun intended, set in stone?”

“Corporate holds that sort of thing very close to their collective chests,” Clarence answered. “I wouldn’t be surprised if they plan to introduce a new architecture. Whether or not they’ve actually committed to a specific design is anyone’s guess, but you can be sure they wouldn’t have started construction unless they had a good idea of the physical space requirements.”

Just then, Rob arrived, and as he was my ride home, I had to leave. On the drive home, Rob asked, “So, how was your first day working for Applazon? You don’t look like you’re walking funny the way most stock boys do after their first day. What exactly is in that building where they have you working anyway?”

“I’m not at liberty to say,” I answered honestly.

“You spent the last two weeks on that fancy Applazon ProBook they gave you, and every time I saw you, you were working on what looked like computer graphics and what I think was computer code. Applazon is the largest provider of web services in the world, so it doesn’t take a genius to figure out there must be a data center in that building, and they’re training you to do something more than scutwork. They’re not paying you $70k a year to do what I’m doing.”

“I still think I should be contributing something toward room and board, until I move out, that is,” I replied.

“We’re all in agreement with Dad there, J.J.,” Rob responded. “What you’ve brought to the family is priceless. We all want you to stay as long as you want to. Actually, I wish you’d never leave. Save your money, maybe buy a car and get yourself a college education. Someday, you’ll undoubtedly move to either coast, but in the meantime, you have family in Omaha.” I couldn’t help the tears that came to my eyes and ran down my face.

Dinner was already on the table when we arrived home, and so Rob and I quickly washed up in the downstairs bathroom and headed upstairs to the great-room table. Everything smelled wonderful, especially after having eaten fast food for lunch. We had meatloaf with mashed potatoes and gravy, green beans and apple pie with cinnamon ice cream for dessert. It was a simple home-cooked meal, but excellent.

“Listen, I need to bring my lunch to work,” I mentioned to Fran as we ate. “There’s absolutely nothing around there within walking distance, and Arby’s is the only place nearby that delivers. Can we get some lunch meats, pudding cups and the like, so that I can take a lunch with me in the morning?”

“You’ll do no such thing,” Fran countered. “I’ll send you off with a proper lunch.”

“Why don’t I just bring you something?” Rob suggested. “I have to stop by around lunchtime anyway to pick up a load, and I have a generous lunch allowance that I never end up spending, so why don’t I just grab double portions of whatever I eat and drop off a care package for you every day?”

“Won’t you get in trouble for that?” I asked.

“Hardly,” Rob answered. “A lot of the other drivers use their allowance to buy stuff to take home with them. As long as my receipts don’t exceed my allowance, no one will look twice at them.”

“But it’s out of your way,” I countered.

“By less than a mile,” Rob pointed out, “and it’ll be a chance to check up on my new brother every day.” How did I get to be so lucky to find my way to this family?

After helping to clean up after what I still called supper and everyone else called dinner, I grabbed my laptop and sat in a corner of the great room, thinking about what I’d seen that day. Other members of the family were around me and the television was on with some program blaring, but I was oblivious. I’d never had the luxury of working in a quiet setting, other than in the school library, and so I barely even noticed the noise.

Pulling up a CAD-CAM program I found on the company site — that’s computer-aided design, computer-aided manufacture — I quickly figured out the graphical interface. It was a very powerful piece of software with many layers of complexity, but the intuitive interface made it easy to understand. All of the custom-server components that were in use in the data center were already in the program, making it easy to utilize them in my own designs. There were a number of components that I hadn’t seen at the data center, however, that I felt should be an integral part of a state-of-the-art facility, such as water-cooling modules.

I started out redesigning the servers themselves, making them more compact and amenable to automated robotic handling. By using off-board water cooling, the servers could be stacked much more closely, allowing for the radial rack layout I had in mind. Typically, water-cooled computer modules were coupled to heat-generating components using thermal paste, but that would’ve required complex connections for the coolant that brought with them the potential for leaking fluid. I tried simulating a number of different arrangements, but all of them would’ve resulted in runaway thermal stress and damage. I couldn’t exactly submerge the servers in water, but I could submerge them in oil. Hardcore gamers were known to overclock their computers and then cool them with everything from submersion in oil to liquid nitrogen.

I ruled out oil cooling right away, as I imagined trying to repair a module that had been submerged in oil. I read up on liquid nitrogen and discovered that although nitrogen itself is inert, anything chilled by it can condense liquid oxygen out of the surrounding air, and liquid oxygen becomes explosive in the presence of organic matter. Plastics are organic, and they’re used everywhere as insulation in electronics. It might be possible to redesign the electronics to be plastics-free, but I couldn’t redesign the humans who’d service the servers, so liquid nitrogen at first didn’t seem practical, either.

I spent a good part of the evening looking up information on computer-cooling mechanisms, which really amounted to two problems — removal of excess heat and reduction of thermal stress. Most solutions addressed the former while addressing the latter by happenstance and otherwise ignoring it. I came up with some interesting thoughts such as using the excess heat to power a refrigeration system that would cool the servers, but that was a major research project worthy of a Ph.D. dissertation in and of itself. That was something to consider in the future. I needed something that would be practical to implement now.

I came back to nitrogen. As a gas, it could be as effective a coolant as air, which was nearly 80% nitrogen anyway, but without the issue of condensation. Liquid nitrogen was easy to produce, and when it boiled, it gave off gaseous nitrogen at about 200°C below zero. If I could drop the temperature of the chips that far, the transistors could switch reliably at a fraction of their normal supply voltage, perhaps as little as a quarter of it. Since power is proportional to the square of the voltage, power consumption would drop by about 94%. Even if I couldn’t get the chips that cold and could only cut the voltage in half, power usage would still drop by three-quarters, which was huge.
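
The voltage figures were little more than hopeful targets at that point, but the power math itself was trivial; a couple of lines of Python were enough to check it:

```python
# Dynamic power scales with the square of the supply voltage (P ∝ V²),
# so the savings from a given voltage reduction are easy to estimate.
# The two scaling factors below are rough targets, not measured values.
for v_scale in (0.25, 0.50):   # supply voltage cut to 25% or 50% of nominal
    p_scale = v_scale ** 2
    print(f"Voltage at {v_scale:.0%} of nominal -> "
          f"power at {p_scale:.2%}, a {1 - p_scale:.0%} reduction")
```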

Dropping power consumption by 75% would drop waste heat by as much, making it that much easier to cool the server components. There was still the issue of oxygen and organic matter, though. The nitrogen would be entirely gaseous by the time it reached the electronics, but would it displace enough oxygen to prevent any reaction? Based on my calculations, it would within a matter of seconds, so long as we enclosed the servers. Even with a very slow leak, however, the escaping nitrogen could eventually drop the oxygen level in the entire server room, asphyxiating the techs inside. That we’d need to monitor oxygen levels was a given, but it might also be necessary to issue oxygen masks and monitor blood-oxygen levels whenever the techs worked on the servers. With such a huge reduction in power consumption, though, the cost would be well worth it.

Going back to the CAD-CAM program, I arranged all of the server components, including the CPU, drivers, RAM, SSD and the optical Ethernet interface chips, on a single compact circuit board. Not only would that allow for simpler, more efficient nitrogen cooling, but it would allow for a much more densely packed arrangement of servers within the server cabinet. The downside was that replacement of individual components would be more difficult, if not impossible. However, nitrogen cooling should cut component failure rates by at least 90% and extend SSD life by a factor of four. Early SSD failure might still be an issue, but I strongly suspected I could identify SSDs likely to fail before they were installed. I had some ideas for modeling, but that was another project for another time. Otherwise, the improved longevity would shift the economics of operation from repairing servers to replacing and recycling them.

I designed the server racks as stackable toroidal cabinets, each of which could hold several times as many servers in a vertical orientation as a conventional square cabinet, but in a footprint only as large as all of them combined. However, the number of racks that could be stacked on top of each other was limited only by the available vertical space, with six racks fitting within the ceiling height of a typical server room, making allowances for headroom for infrastructure and servicing the individual racks within each stack. Even with allowances for space around each stack, I could fit more than triple the number of servers in the same physical space of conventional rack systems.

The most important part of the design, however, was the automated robotic system for server replacement. I’d envisioned a hollow center in the server racks, with a central robotic arm that would access all of the servers arrayed within a stack, completely automating the process of server replacement. A separate set of tracks, probably mounted on the ceiling, would connect the robotic mechanisms to a supply of servers from a central location. On the other hand, the increased longevity afforded by nitrogen cooling should make it feasible to cache a supply of replacements inside each stack. The stacks could then effectively be hermetically sealed, minimizing nitrogen leakage and maximizing cooling. The server cache would then be replenished by hand on an infrequent basis.

Once I was satisfied with the overall design as rendered by the CAD-CAM software, I saved it to my personal corporate cloud space and then considered what to do with it. I reasoned that if I simply sent the entire design to my superiors, then even if it went anywhere at all, whatever resulted from it would ultimately be attributed to someone else. Therefore, I generated a series of conceptual drawings and sent them to Mr. Winters with a note to the effect that I had complete manufacturing designs for everything, ready to go.

Just as I was about to log off the site, close my laptop and go to bed to perhaps enjoy another night of experimentation with Sammy, I had a strange thought. Could someone get into my account and steal my design? My account was password-protected, but a system administrator could override such meager security in a nanosecond. Not that I thought anything I was doing was worthy of such attention, but I’d read stories about corporate espionage and how brutal the competition could be among colleagues. As a precaution, I decided to encrypt all of the associated files with a 1,024-bit key of my own.
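
It only took a few lines of Python with the cryptography library to derive a key from a passphrase and scramble each file; something along these lines, give or take the key size, with the file name and passphrase here being nothing but placeholders:

```python
# Minimal sketch: derive an encryption key from a passphrase and encrypt a file.
# The file name and passphrase are placeholders, and Fernet's 256-bit key is an
# illustration rather than the exact key length described above.
import os
import base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_file(path: str, passphrase: bytes) -> None:
    salt = os.urandom(16)                          # random salt, stored with the ciphertext
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase))
    with open(path, "rb") as src, open(path + ".enc", "wb") as dst:
        dst.write(salt + Fernet(key).encrypt(src.read()))

encrypt_file("server_redesign.dwg", b"a long passphrase that never leaves my head")
```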

The author gratefully acknowledges the invaluable assistance of David of Hope and vwl-rec in editing my stories, as well as Awesome Dude and Gay Authors for hosting them. © Altimexis 2021