The New Superheroes of Supercomputers

Auburn’s growing supercomputer user base and demand drive the need for a third system expansion

New discoveries often spawn new questions, which in turn increase the complexity and challenge of modern research. To tackle the size and intricacy of these new problems, Auburn is empowering faculty and students with state-of-the-art supercomputers, with which they can make observations and discoveries that would otherwise be impossible.

In previous Auburn research efforts, scientific computing was treated as an ad hoc effort, with each researcher or lab often maintaining and administering their own purpose-built machines as needed. This practice, however, carries the implicit overhead of making sure these complex systems are functioning properly and efficiently and are managed securely; ultimately, it takes time and effort away from the lab’s primary focus. To reduce these complexities and support the research community, Auburn’s Office of Information Technology (OIT) has recently made major strides in the university’s centrally maintained high performance computing offerings and research computing support.

Auburn University unveiled its first centrally operated supercomputing system as a research enhancement tool and service in 2013. Now, in just six years’ time, OIT is preparing to launch its third generation of this high-performance Goliath.

“We started with the CASIC machine—located in Auburn’s Research Park, in the building with the same name,” said Bradley Morgan, OIT infrastructure architect. “At that time, we had fewer than 50 researchers using the supercomputer.”

Use grew rapidly, however, as researchers learned of CASIC’s capabilities and the corresponding support provided by OIT’s team of high performance computing (HPC) specialists. In less than three years, the CASIC system had seen a major increase in demand, and it was complemented by a new $1 million supercomputer, HOPPER, named in honor of the late Rear Admiral Grace Hopper, a pioneer of computing technology. HOPPER is housed in Auburn’s primary data center, located in the lower level of the OIT building on the south side of campus.

“HOPPER, along with our team of system administrators, now supports over 500 users working on a variety of projects across the majority of colleges and schools,” Morgan added.

One such user, Dr. Evangelos Miliordos, speaks to how high performance computing helps him balance his workload as an assistant professor. His research in quantum chemistry, underway since he joined Auburn in 2016, aims to “disclose new chemical systems which can facilitate the conversion of inert hydrocarbons, such as methane, to functional platform chemicals, which can further be processed to make goods we use on a daily basis.” Since joining the university, he has had to balance teaching, advising and research. Beyond the computational power itself, Miliordos credits the service with supporting “researchers like me who have the time and energy to put their efforts in doing research and being productive. I am certainly grateful to HPC, and the human power behind the HPC initiative, for the most productive years of my career so far.” He adds that “this facility is a great asset for our university, and I had no hesitation in investing and participating in the HPC initiative.”

In response to such growing interest in high performance computing, Auburn has plans to build a third supercomputing system by 2020, Morgan said.

This new supercomputer cluster will make available to Auburn researchers the next generation of high performance computing nodes—a series of individual fast-operating computers tied together into a cohesive system that computes with lightning speed. Auburn’s third generation system will have nodes with up to four times the processing power of HOPPER, which is already powerful with more than 16 terabytes of memory and 1.4 petabytes of disk space. To put this into context, consider that a Blu-ray disc, which can hold three hours’ worth of high definition movie and bonus material, stores 50 gigabytes. It takes 1,000 gigabytes to equal a terabyte, and 1,000 terabytes to equal a petabyte. That means 1.4 petabytes could store the contents of 28,000 Blu-ray discs!

To effectively process the large data sets that reside within this storage capacity, HOPPER and CASIC can crank out a combined 270 trillion floating point operations, or FLOPs, per second. A floating point operation is essentially a single arithmetic calculation on real numbers, large or small. To match what these machines do in one second, a human performing one calculation every second would need over eight million years, or, to match the upcoming system, over 30 million years.
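
For readers who want to check the arithmetic behind these comparisons, the short Python sketch below reproduces it. The 50-gigabyte disc size, the 1.4 petabytes of storage and the 270 trillion operations per second come from the figures above; the decimal (1,000-based) unit prefixes and the 365-day year are simplifying assumptions.

    # Back-of-the-envelope check of the storage and speed comparisons above.
    # Assumes decimal (1,000-based) unit prefixes and a 365-day year.
    disk_gb = 1.4e6                  # 1.4 petabytes expressed in gigabytes
    bluray_gb = 50                   # capacity of one Blu-ray disc, in gigabytes
    print(disk_gb / bluray_gb)       # -> 28000.0 discs

    flops = 270e12                   # combined operations per second (HOPPER + CASIC)
    seconds_per_year = 60 * 60 * 24 * 365
    print(flops / seconds_per_year)  # -> about 8.6 million years at one calculation per second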

Faculty and graduate students in the School of Forestry and Wildlife Sciences share the importance of that added speed in their experience with HPC for their land-climate interaction research. Sathish Akula, a graduate student working on this project, said that “our daily work involves analyzing multiple data sets of many years together. Doing this much processing on a personal computer would not be time efficient, whereas using HPC allows us to run the same script using multiple processors at the same time, which speeds up the processing.” With time-sensitive research such as drought predictions and both short- and long-term climate shifts, waiting a long time for results isn’t an option.
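
Akula’s description, running the same script on many processors at once, is a classic data-parallel pattern. The sketch below is only a minimal illustration using Python’s standard multiprocessing module; the per-year analyze() function and the range of years are hypothetical stand-ins, not the group’s actual code.

    # Minimal data-parallel sketch: run the same analysis over many yearly data
    # sets at once instead of one after another. analyze() is a hypothetical
    # stand-in for the real per-year climate analysis.
    from multiprocessing import Pool

    def analyze(year):
        # Placeholder workload representing one year's worth of processing.
        return year, sum(i * i for i in range(100_000))

    if __name__ == "__main__":
        years = range(2000, 2020)
        with Pool(processes=8) as pool:   # 8 workers here; an HPC node offers many more
            for year, result in pool.map(analyze, years):
                print(year, result)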

Akula goes on to say that their advancement in these sciences is due, in large part, to the processing power of HPC.

While Auburn’s supercomputing efforts have only been around since 2013, it’s worth noting that supercomputers have been in existence since the 1960s. Since that time, they have played an important role in the field of computational science, and they are still used for a wide range of computationally intensive tasks in various fields, including quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling and complex physical simulations. In short, supercomputers exist to help solve extremely complex and difficult problems. They require the use of hundreds, or even thousands, of computer processors, vast amounts of memory, and parallel algorithms and software all working in concert to arrive at a solution.

“Our supercomputers currently have more than 700 software packages installed to support researcher needs,” added Matt Smith, a system administrator in the OIT HPC team.

“We provide software installation support for researchers, which can often be a major hurdle for them. Instead of spending hours chasing down compilation or installation problems, we take care of it for them, allowing them to focus more directly on their subject matter.”

Principal investigators purchase their time and space on the supercomputer based on their computational needs. To ensure efficient use of the machine, work is scheduled and managed through the system’s workload manager, or scheduler. The scheduler tracks utilization in real time and allocates resources using a customized algorithm that determines the best allocation based on input provided by the researcher. OIT configures and maintains the scheduler, as well as the operating systems, software packages, cabling, hardware and anything else needed to run the system effectively. To help researchers use the system, OIT also provides user training and technical support.
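
The article does not name the scheduler or describe its algorithm, so the sketch below is only a toy model of the general idea: each researcher states what a job needs (cores, memory, wall time), and the workload manager decides which queued jobs fit the resources that are currently free. The job names, numbers and shortest-job-first policy are illustrative assumptions, not Auburn’s actual configuration.

    # Toy model of a workload manager: match queued job requests against the
    # resources currently free. The real scheduler's policy is not described in
    # the article; the jobs, numbers and shortest-job-first rule are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        cores: int      # requested CPU cores
        mem_gb: int     # requested memory, in gigabytes
        hours: float    # requested wall time

    free_cores, free_mem_gb = 128, 512    # resources currently idle

    queue = [
        Job("quantum_chem", 64, 256, 48.0),
        Job("climate_stats", 32, 128, 12.0),
        Job("md_simulation", 96, 384, 24.0),
    ]

    for job in sorted(queue, key=lambda j: j.hours):   # favor shorter jobs first
        if job.cores <= free_cores and job.mem_gb <= free_mem_gb:
            free_cores -= job.cores
            free_mem_gb -= job.mem_gb
            print("start", job.name)
        else:
            print("hold ", job.name)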

David Young, a researcher in the Department of Chemical Engineering, considers the HPC team a valuable asset. His project, which focuses on the development of screening strategies in the context of current technologies, has been underway since December 2016. To test a strategy, Young must perform a combination of simulations that could take hundreds of hours on a single computer. He knows HPC has significantly sped up his data collection process, but “without the help of the HPC admins, I think I would have had a much tougher time,” Young said.

Currently, the colleges of engineering and sciences and mathematics account for the most users of the supercomputer at Auburn. Researchers in agriculture, forestry, veterinary medicine, pharmacy, liberal arts and education also use the system in their research. The supercomputing efforts at Auburn are still relatively young, but OIT is excited to facilitate growth as researchers’ technological needs continue to change and increase.

Auburn University is a nationally ranked land grant institution recognized for its commitment to world-class scholarship, interdisciplinary research with an elite, top-tier Carnegie R1 classification and an undergraduate education experience second to none. Auburn is home to more than 30,000 students, and its faculty and research partners collaborate to develop and deliver meaningful scholarship, science and technology-based advancements that meet pressing regional, national and global needs. Auburn's commitment to active student engagement, professional success and public/private partnership drives a growing reputation for outreach and extension that delivers broad economic, health and societal impact. Auburn's mission to educate, discover and collaborate drives its expanding impact on the world.