
Accidental Invention: Kevlar

While often associated with high-stakes industries like defense and aerospace, Kevlar has been used in a variety of products, including tires, gloves, sports equipment, and more. The fibrous material that began as an unexpected discovery in a laboratory has gone on to save lives, transform industries, and inspire engineers around the world.


Kevlar, a super-strong synthetic fiber, is a key component in everything from bulletproof vests to racing sails. Kevlar’s durability and versatility make it a go-to material for improving product performance and safety. The story of Kevlar is a compelling blend of scientific curiosity, accidental discovery, and valuable innovation.

An Unexpected Solution

In the mid-20th century, chemist Stephanie Kwolek was employed by the DuPont Company to work on projects involving polymers and low-temperature condensation processes. At one point, she was tasked with finding a lightweight, durable, and heat-resistant fiber to replace the steel wire used in car tires. During her research, she worked with a class of synthetic polymers called polyamides, dissolving them in solvents and then running the solution through a machine that would spin it into fiber.

One day in 1965, during her experimentation, Stephanie got an unexpected result. Instead of the typical thick, transparent polymer solutions she had grown accustomed to, this new solution was cloudy and watery. She spun the mysterious substance anyway and, in doing so, created one of the strongest fibers the world had ever seen.

A Fiber of Many Uses

Following the discovery of poly(p-phenylene terephthalamide) and its valuable traits, DuPont began commercially producing the product under the name Kevlar. While it became a component in radial tires as originally intended, the material also found its way into numerous other applications. Due to the fibers’ impressive toughness (up to five times stronger than steel by weight), durability, and heat resistance, Kevlar proved useful in other industries, such as defense, aviation, and construction.

Today, one of Kevlar’s most common applications is in protective gear. With its incredible tensile strength, heat tolerance, and resistance to penetration, it has been used to make bulletproof vests, work gloves, and firefighter suits. Kevlar’s remarkable qualities are also put to use in racecar tires and brake pads; parts for aircraft, spacecraft, and boats; and sports equipment like medicine balls, mountaineering ropes, and tennis racquets.

Weaving the Future

Over the years, scientists and engineers have pushed the boundaries of Kevlar’s capabilities. Different formulations and treatments have been developed to enhance its resistance to chemicals, flames, and abrasion, making it suitable for an even wider range of applications. DuPont continues its commitment to invest in constant quality improvements.

As we look to the future, Kevlar’s potential seems limitless. Researchers are exploring ways to integrate the fibers into wearable technology, medical devices, and even lithium-sulfur batteries.

Kevlar’s journey from a serendipitous laboratory discovery to a global engineering staple is nothing short of remarkable. With ongoing research and development, the future of Kevlar holds promise for even more groundbreaking applications thanks to its impressive durability and versatility.

If you enjoyed this accidental invention story, you may be interested in reading about Safety Glass, Super Glue, and Silly Putty.


To learn more about Custom Powder Systems and the art of engineering, sign up for our newsletter.

Accidental Invention: Saccharin

If you’ve ever looked through a collection of sweeteners for your coffee or tea, you’ve probably found several packets in various colors. But do you know what is actually inside? The pink one contains saccharin – a zero-calorie sugar alternative that was a surprisingly serendipitous result of laboratory experimentation 150 years ago.


A substance that is 200 to 700 times sweeter than sugar, saccharin was the first-ever sugar substitute and has been commonly used as a calorie-free beverage additive for decades. Though there has been much debate about its potential health impacts, saccharin has continued to be one of the most popular artificial sweeteners on the market.

A Sweet Surprise

In 1876, American chemist Ira Remsen returned from Munich and Göttingen University in Germany, where he had studied sulfobenzoic acids, and established the first chemistry department at the newly founded Johns Hopkins University in Baltimore. A year later, the firm of William H. Perot & Co. hired German chemist Constantin Fahlberg to examine a shipment of demerara sugar alleged to have been artificially darkened to avoid higher import taxes, with arrangements for him to work in Remsen’s laboratory. After completing his analysis, Fahlberg received permission to stay at the university and began researching coal tar derivatives alongside Remsen and his team.

One day, after returning home from the lab, Fahlberg noticed the piece of bread he was eating tasted exceptionally sweet. However, he quickly realized that it was not actually the food that was sweet, but his hands. So, he returned to the lab and tasted all of his beakers, glasses, and bowls until he determined which substance was the source – an oxidation of ortho-toluenesulfonamide that created benzoic sulfimide, which he called saccharin, meaning “of or resembling sugar.”

Sweeter than Sugar

Over the next few years, Fahlberg and Remsen co-authored an article describing the synthesis of saccharin, highlighting that the compound was miraculously “sweeter than cane sugar.” Though Remsen did not like the concept of industrial chemistry, Fahlberg recognized the commercial potential of his discovery and applied for both German and American patents to cheaply produce the substance in larger quantities. Despite Remsen’s protests, Fahlberg was awarded a US patent in 1886 and began manufacturing saccharin in pill and powder forms, marketing them as beverage additives.

Before long, however, competitors and consumers began raising concerns about the safety of saccharin, and with the release of Upton Sinclair’s The Jungle in 1906, Americans began to demand action in response to food-industry horror stories. Harvey Wiley, head chemist of the US Department of Agriculture, responded by proposing the first saccharin ban, believing that the substance could not possibly be safe. The proposal was quickly shot down by President Theodore Roosevelt, whose personal physician had prescribed him saccharin as a weight-loss aid. Roosevelt declared, “Anyone who says saccharin is injurious to health is an idiot. Dr. Rixey gives it to me every day.”

Sickly Sweet

Eventually, in 1912, saccharin was banned for use in food manufacturing, but it was still available as a standalone product, continuing to be a desirable “non-fattening” alternative for diabetics and those looking to cut calories. Soon, it became an even more popular substitute due to sugar shortages during the World Wars. In the 1960s, saccharin gained even more traction as American interest in weight loss continued to grow, and the recognizable brand Sweet’n Low was created.

Several studies conducted on rats in the following years suggested a link between saccharin and bladder cancer, leading to the Saccharin Study and Labeling Act of 1977, which required products containing the ingredient to have a warning label stating it may be hazardous to the health of consumers. However, later findings concluded that those results were irrelevant because humans metabolize the chemical differently, and it was removed from the potential carcinogens list in 2000, rescinding the packaging regulations.

Though there has been continued controversy about artificial sweeteners, the discovery of saccharin opened the doors for new innovations that have provided numerous alternative choices for individuals looking for low-calorie sugar substitutes. As consumer preferences continue to trend towards “lighter” and healthier options, it’s likely that artificial sweeteners will remain significant in the food and beverage industry for years to come.

If you enjoyed this tale of accidental innovation, check out similar stories about Corn Flakes, potato chips, and penicillin.


To learn more about Custom Powder Systems and the art of engineering, sign up for our newsletter.


Inventions Ahead of Their Time: French Fries

Is there anything better than biting into a fresh, salty, crispy french fry? Whether served alongside a burger or dipped into your milkshake, there’s no denying their deep-fried deliciousness. But where did this iconic side dish come from, and how did it become the beloved treat we know today? The original recipe is probably older than you think…


When you think of making french fries, you probably envision thin potato strips cooking in a restaurant’s industrial-sized fryer filled with bubbling hot oil. While this iconic side-dish staple is most often associated with fast food and “American-style” restaurants, french fries were not originally created in the US. And, although they seem like a modern food, the first french fries were likely invented over 250 years ago.

Belgian Fries?

The history of french fries is a bit of a culinary conundrum. While the name suggests a French origin, some food historians argue that the story begins in Belgium. One theory involves potatoes that were brought to Europe from Peru by the Spanish in the late 16th century. At the time, Spain controlled what is now known as Belgium, so the citizens of the area were among the first to be introduced to the vegetable.

The nearby River Meuse served as an abundant source of fish, and locals would fry small ones to go along with their meals. However, when the river froze over in the winter, they began frying thin strips of potatoes instead. As the story goes, the villagers fed these fried potato sticks to soldiers while France was at war, and soon after, they became popular around the world.

However, some historians doubt the plausibility of this theory. They suggest that potatoes were not introduced to that specific area until decades later. Plus, at the time, oil and fat were too expensive and difficult to find in large quantities for frying food.

Paris Fries?

As you might guess, the French also stake a claim to the invention of french fries. According to popular lore, they were sold by street vendors in the late 18th century near Pont-Neuf, the oldest bridge in Paris. This suggests there was no single identifiable creator, but that it was likely the invention of a Parisian peddler.

According to food historian Pierre Leclercq, the first recorded mention of “french fries” was found in a Parisian book from 1775, and the first recipe appeared in the 1795 cookbook La cuisinière républicaine. Soon thereafter, a notable chef named Frederic Krieger began traveling through Belgium cooking “Paris-style fried potatoes.”

Francophone Fries?

Ironically, it’s believed that Americans were largely responsible for popularizing the dish under the name “french fries.” One theory suggests that during WWI, American soldiers in Belgium discovered the food and referred to it as such because the local population spoke French.

Another tale involves French pharmacist Antoine-Augustin Parmentier. When potatoes were originally brought to France from the New World, they were met with much skepticism. In an attempt to popularize the new vegetable, he held extravagant dinner parties where potatoes were served cooked in a variety of different ways. Possibly among his guests was future US President Thomas Jefferson, who is said to have encountered “potatoes served in the French manner” during his time as an ambassador in France and returned home with a recipe that would make its rounds throughout America.

American Fries?

While french fried potatoes became popular several centuries ago, they may not have originally looked or tasted quite the same as they do today. The potatoes were likely sliced into chunks or rounds rather than “sticks” and only cooked once. The term “frenching” simply refers to a method of food preparation in which ingredients are cut in even sizes so that all sides are exposed to heat, such as in an oven or fryer. The first known recipe for the crispy-on-the-outside, soft-on-the-inside, double-fried potatoes we love today did not appear until the early 20th century, in the Belgian book Traité d’économie domestique et d’hygiène.

In an interesting twist, modern American fast food chains are often credited with popularizing french fries on a global scale. During WWII, meat shortages resulted in restaurants searching for an inexpensive – yet filling – side dish, and french fries fit the bill. As these chains grew, more and more countries around the world began to enjoy the crispy, deep-fried potatoes we know so well. Today, nearly one third of all potatoes grown in the United States become frozen french fries, and the average American eats about 40 pounds of fries each year.

Whether you prefer Belgian-style with a dollop of mayo or classic American-style with ketchup, one thing is for certain – french fries have rightfully earned their place in the culinary hall of fame. While the exact recipe may have evolved over time, fried potatoes have fittingly remained a favorite dish in many cultures for centuries.

If you enjoyed this invention story, you might also like the ones about potato chips, cornflakes, and Heinz.


To learn more about Custom Powder Systems and the art of engineering, sign up for our newsletter.


Riata Center releases list of Cowboy100 honorees

Our beloved Chief Executive Officer, Denise McIntosh, was named to the list of honorees for the Riata Center for Innovation and Entrepreneurship’s third Cowboy100 Honoree Gala, which celebrates the business and leadership achievements of OSU graduate-owned or -led businesses. The gala will be held on March 29 at the Wes Watkins Center in Stillwater.

The Cowboy100 serves as a resource for students to engage with industry leaders, positions the Riata Center as the reference point for entrepreneurship throughout the university, and raises funds for the Riata Center’s student programs and activities.

As part of the Cowboy100, the honorees generating the highest top-line revenue for the years being measured are recognized on the Blazing10 list. While the overall list celebrates growth, the Blazing10 focuses primarily on top-line revenue, another important measure of business success.

“We are happy to release the list of the 2024 Cowboy100 honorees,” said Marc Tower, assistant dean for Outreach and Economic Development at the Spears School of Business. “The quality and diversity of this group is inspiring. We have companies and leaders from multiple industries, and from companies large and small. It is exciting to celebrate and share the hard work and success of these outstanding OSU graduates and Cowboy leaders. We look forward to celebrating their achievements on March 29.”

For more information and the complete list: https://news.okstate.edu/articles/business/2024/riata_center_releases_list_of_cowboy100_honorees.html


Inventions Ahead of Their Time: Automatic Doors

Daily, we encounter automatic doors almost everywhere we go – from supermarkets to office buildings, airports, and even our favorite coffee shops. They have become integral to modern architecture, making our lives more convenient and efficient. But have you ever wondered about the origins of this remarkable invention? (Hint: The idea predates modern electricity!)


The concept of automatic doors may seem like a recent development, but their origins trace back further than you might imagine. The first known automatic doors date back to ancient times, when the Greeks and Romans used hydraulics, pulleys, and weights to operate them. However, it wasn’t until the 20th century that true automatic doors began to take shape as we know them today.

Opening New Doors

The visionary behind the modern automatic door (and the first vending machine!) was Heron of Alexandria, a Greek engineer and mathematician who lived in the first century AD. Heron is credited with the “Pneumatica,” a treatise describing a series of mechanical devices powered by air pressure generated by fire, which included an early version of automatic doors. These doors, also known as “Heron’s doors,” operated pneumatically, relying on compressed air to open and close.

Although Heron’s automatic doors were ingenious, they were undoubtedly ahead of their time. The technology required to create a practical and reliable automatic door system wasn’t available until much later. It was in the 20th century that significant advancements in electronics, sensors, and control systems paved the way for the widespread adoption of automatic doors.

When One Door Closes, Another One Opens

Centuries later, in 1931, American engineers Horace H. Raymond and Sheldon S. Roby developed an optical sensor for an automatic door that was installed at Wilcox’s Pier Restaurant in West Haven, Connecticut. This revolutionary piece of equipment allowed waitresses to seamlessly carry trays through doorways without kicking them open.

Then, in 1954, American engineers Dee Horton and Lew Hewitt created the first commercial automatic sliding door, which they went on to sell through their company, Horton Automatics. These doors relied on an electric motor and a complex mechanism of gears and rollers to facilitate smooth opening and closing.

Not long after that, the advent of microprocessors in the 1970s brought a new level of sophistication to automatic door systems. With the ability to integrate sensors, timers, and logic circuits, these doors became more intelligent and responsive. This evolution also brought improved safety features such as presence detectors, which use infrared or motion sensors to detect a person’s approach and trigger the door to open.

Leaving the Door Open

With time, automatic doors also evolved beyond just sliding motion, encompassing various types that suit different architectural designs and functional requirements. Swing doors, similar to those found in supermarkets, were introduced to accommodate high-traffic areas. These doors utilize sensors to detect a person’s approach and open in response, facilitating a seamless entry or exit experience. Revolving doors, popularized in the early 20th century, have also undergone automation. This variety combines the benefits of energy efficiency, security, and smooth traffic flow, making them ideal for busy entrances such as airports and hotels.

As technology continues to advance, the future of automatic doors looks promising. Integrating artificial intelligence and machine learning algorithms may enable doors to adapt and learn from human behavior, anticipating movement patterns and adjusting door operation accordingly. Furthermore, touchless technologies such as gesture recognition and voice control may redefine the user experience, allowing individuals to gain access to a building with a simple wave of the hand or a voice command and eliminating the need for physical contact.

From Heron’s ancient pneumatic doors to the cutting-edge automated systems we have today, the evolution of automatic doors is a testament to human ingenuity and the relentless pursuit of convenience and efficiency. These remarkable inventions have forever transformed our daily lives, making entryways more accessible, enhancing security, and optimizing traffic flow.

To learn about more inventions ahead of their time, check out these stories about motorcycles, electric cars, and corrective lenses.


To learn more about Custom Powder Systems and the art of engineering, sign up for our newsletter.


Accidental Invention: Potato Chips

WARNING: Reading this article may incite a ravenous craving for potato chips. Viewer discretion is advised.


The crunchy, salty, irresistible snack that you know and love was first created nearly two centuries ago. Potato chips are said to have originated from an interaction between a picky restaurant patron and an irritated cook… But is that really where they came from?

The Legend of the Salty Chef

As the story goes, Native and African American chef George (Speck) Crum worked at Moon’s Lake House in Saratoga Springs, New York. One day in 1853, he encountered a particularly fussy eater. Cornelius Vanderbilt had ordered fried potatoes, which he then sent back because they were cut too thick. George, in an act of spiteful pettiness, proceeded to slice a potato as thinly as possible and fry it to a crisp… And Cornelius loved it.

As fun as this story is, historians have mostly debunked it. George Crum still, however, often receives credit for popularizing the snack, as he continued to serve them to enthusiastic patrons.

George’s “Saratoga Chips” quickly became a hit around town and then beyond Upstate New York. In 1860, the chef opened his own restaurant, Crum’s House, where each table was served a basket of his famous potato chips. The crisps eventually became quite sought-after throughout the U.S., with the first “Saratoga Chips” sold in grocery stores in 1895 by William Tappenden in Cleveland, OH.

Other Cooks in the Kitchen

Over the years, other possible origin stories of the invention of the potato chip have surfaced.

George Crum’s coworker and sister, Catherine Adkins Wicks, also claimed to be the true inventor of the potato chip. In some versions of the original story, she is said to have been the one who served the thin crisps to Cornelius Vanderbilt. In another, Catherine was allegedly peeling potatoes when she accidentally dropped a slice into a pot of boiling fat.

Another Moon’s Lake House employee, “Eliza, the cook,” was claimed to have been making chips as early as 1849. A New York Herald article from the time said her “potato frying reputation is one of the prominent matters of remark at Saratoga.” Others at the restaurant credited with the invention include the owners, restaurant manager Hiram Thomas, and several other cooks.

A different story from Smithsonian Magazine reports that “the earliest known recipe for chips dated to 1817 when an English doctor named William Kitchiner published The Cook’s Oracle, a cookbook that included a recipe for ‘potatoes fried in slices or shavings.’” So, historians have largely agreed that we may, unfortunately, never know the true origin of the chip.

You Can’t Eat Just One

As you can probably guess, the popularity of potato chips grew exponentially, and recipes and production continued to evolve.

In the early 1920s, Herman Lay (name sound familiar?) began making his own potato chips and selling them out of the trunk of his car. As he began commercializing the product, rumors spread that the chips had an aphrodisiac quality, which only bolstered his sales even more.

Smithsonian Magazine also reports that “In 1926, Laura Scudder, a California businesswoman, began packaging chips in wax-paper bags that included not only a ‘freshness’ date but also a tempting boast – ’the Noisiest Chips in the World.’” The new packaging design helped the snack stay fresher and crispier for longer, making them even more popular and allowing them to be mass-marketed.

Potato chips didn’t get their flavoring until the 1950s, thanks to Irishman Joe “Spud” Murphy. After founding Tayto, he developed a manufacturing process that created some of the most popular flavors we still know and love: Sour Cream and Onion, Barbecue, and Salt and Vinegar.

Today, Americans consume about 1.85 billion pounds of potato chips each year, supporting an estimated $10.5 billion industry. Because, in the words of Lay’s 1961 spokesperson Bert Lahr, “You can’t eat just one!”

If you enjoyed this accidental invention story, you might also like the ones about silly putty and Corn Flakes.



Accidental Invention: Silly Putty

Simple. Squishable. Moldable. Silly Putty has been a popular children’s toy for over 80 years. But did you know it wasn’t created for kids? Silly Putty’s origin story begins with an accidental discovery during the rubber shortage of World War II.


During WWII, many of the countries that produced rubber were under invasion, and the Allies faced an extreme rubber shortage. In an effort to combat the lack of this essential manufacturing material, the U.S. government contracted companies to create a synthetic rubber substitute that could be made from readily available materials.

It was during this experimental process that one of the world’s most popular toys was inadvertently created.

A Goo With Interesting Properties

It all started in General Electric’s New Haven, Connecticut Lab in 1943, where inventor James Wright was testing potential methods to create synthetic rubber. During one attempt, he mixed boric acid and silicone oil, creating a gooey, stretchy substance. While it proved to be a poor substitute for rubber, its unique properties turned some heads.

This “nutty putty” was stretchier and bouncier than rubber, and it adhered to ink to make a perfect copy of whatever newspaper or comic book it touched. James soon began sending samples to labs around the world to find a potential use for his discovery. Unfortunately, there was not much interest from other scientists or the U.S. government, so the mysterious goo fell by the wayside.

Passing Around the Party Putty

In spite of there being no obvious practical use for the putty, James continued making it. The goopy goo eventually started making appearances as a novelty passed around at parties. At one such party, the rubbery substance was discovered by Ruth Fallgatter, owner of the “Block Shop” toy store. She began selling it in her catalog as “bouncy putty,” and it quickly became a bestseller.

Ruth’s marketing consultant, Peter Hodgson, was so interested in the goo that he purchased its production rights and changed the name to “Silly Putty.” The product’s next release coincided with the Easter holiday, inspiring its famous plastic egg-shaped package. Priced at $1 each, the company sold 250,000 units of Silly Putty in the first three days… and nearly six million units in the first year.

Second Only to Crayola Crayons

The new toy was an instant success, second in popularity only to Crayola crayons. Crayola eventually purchased the exclusive manufacturing rights to Silly Putty in 1977. Today, the company reveals that “although the exact formulas Crayola uses to make Silly Putty are proprietary, we can share that it is made primarily from silicone and color pigments.”

While still commonly known as a toy, Silly Putty also has a few practical uses, such as picking up dirt and lint and stabilizing wobbly table legs. It even made it to space on the 1968 Apollo 8 mission, where astronauts used it to secure their tools to surfaces while orbiting the moon.

We love practical inventions, but we also love the impractical fun ones, too! If you need help figuring out an idea, we’re here for you… no matter how “silly” it seems.


To learn more about Custom Powder Systems and the art of engineering, sign up for our newsletter.


Inventions Ahead of Their Time: Corrective Lens/Contact Lens

Today, a large portion of the population wears eyeglasses or contacts regularly. But how long ago would you suppose corrective lenses were first utilized? 100 years ago? 500? 1,000? How about over 2,000? Yep, that’s right. Check out this article to learn the early origins of these incredibly useful tools.


Glasses have become pretty standard fare for a lot of individuals. It is estimated that 75% of the US population requires some sort of vision correction. Can you imagine what life would be like if that many people couldn’t see properly? Thankfully, the invention of modern-day corrective lenses began many centuries ago.

Magnifying Spheres

The earliest iteration of corrective lenses is commonly traced back to Ancient Rome, where the philosopher Seneca the Younger brought spheres of glass and jewels for Emperor Nero to use for magnification. It was discovered that convex lenses could be used to enlarge small objects such as letters or organisms. Surprisingly, however, it took nearly a millennium for this early discovery to evolve into a more sophisticated design.

During the late Middle Ages, European inventors stumbled upon the writings of the Muslim mathematician and scientist Alhazen, which described the properties of convex lenses. Research and development began to take hold, and, in 1286, the Italian Dominican friar Giordano da Pisa created what is believed to be the world’s first pair of eyeglasses. These were designed to be held in front of the face or perched on the nose.

Eyeglasses Continue to Evolve

Because the materials initially used to make eyeglasses were so expensive (e.g., crystal, leather, and animal horn), they were largely only available to the wealthy. However, as literacy rates began to boom in the 15th century, demand for more affordable glasses quickly grew. Lenses came to be made of glass, which could be ground to correct a wider range of nearsightedness and farsightedness.

The next (and arguably most useful) development came in the 18th century, when eyeglasses became hands-free thanks to frames with arms that rested over the ears. Soon after, Benjamin Franklin introduced the concept of bifocals, and George Airy created lenses that would correct astigmatism. Then, as the Industrial Revolution greatly improved manufacturing processes, eyeglasses finally became available to nearly everyone.

Lighter, Cheaper, and More Convenient

Over time, eyeglasses continued to become lighter and cheaper with both frames and lenses able to be made from plastic. Protective coatings were also added to reduce glare and UV light for the wearer. Today, eyeglasses can be customized to help correct vision impairments all over the spectrum.

In recent years, we have also seen the contact industry take off, allowing an even more hands-free version of corrective lenses that are more convenient for many individuals. First made from glass in 1887, these “in-eyeglasses” went through about a century of development until they reached the soft gel versions that are most commonly worn today. Ironically, after thousands of years of experimentation, it seems that contacts are the most similar to the original magnifying spheres of glass.

It’s no question that the invention of corrective lenses made a huge impact on the world. We at Custom Powder Systems love to see how technology develops over time. If you have a game-changing idea that you’d like to bring to life, let us know how we can help!


To learn more about Custom Powder Systems and the art of engineering, sign up for our newsletter.


Mothers of Invention: Olga D. González-Sanabria

A minority in many facets of her life, Olga D. González-Sanabria was one of the few women who earned an engineering degree from the University of Puerto Rico and eventually became the highest-ranking Hispanic employee at NASA’s Glenn Research Center. Her invention of the long cycle-life nickel-hydrogen battery has been a critical tool in the advancement of energy storage for space exploration.


A brilliant woman who eventually became the highest-ranking Hispanic employee at NASA’s Glenn Research Center, Olga D. González-Sanabria made great strides in the field of chemical engineering and is especially recognized for her achievements related to energy storage technologies for space.

A Natural-born Engineer

Born, raised, and educated in Puerto Rico, Olga was destined to become an engineer, as she took an interest in math and science early on in her life. During a high school career fair in the 1970s, she was taken by the idea of helping to solve the energy crisis, and soon joined the ranks of the few women studying engineering at the University of Puerto Rico. She earned both a bachelor’s and master’s degree in chemical engineering, the latter from the University of Toledo.

In 1979, Olga began her career at NASA’s Glenn Research Center researching energy storage technologies for space in the Electrochemistry Branch of the Solar and Electrochemistry Division. Over the years, she made great strides in various research departments until officially being promoted to management in 1995. She ultimately became the director of the Engineering Directorate, a position she held until the end of her service in 2011.

A Battery for Outer Space

During her tenure at NASA, Olga was constantly working to create and improve various tools to be used for space exploration. Most notable were her team’s advancements with nickel-hydrogen fuel cells, a critical power source that was known to deplete too quickly. After much research and experimentation, the scientists significantly improved the separators that isolate the cells’ electrodes and reduce voltage losses within the battery.

Olga and her team’s creation of the long cycle-life nickel-hydrogen battery was a monumental achievement and was put to use in the International Space Station power system. This type of battery, on average, could run for 40,000 cycles, and last for 10-15 years. In 1988, Olga’s team received an R&D 100 award for their invention.

An Engineering Role Model

Olga retired from NASA after 32 years of service, but not before earning numerous awards for her achievements, including the Women of Color in Technology Career Achievement Award (2000), Outstanding Leadership Medal (2002), Ohio Women’s Hall of Fame Inductee (2003), YWCA Women of Achievement Award (2004), and Presidential Rank of Meritorious Executive (2007). Most recently, she was inducted into the NASA Glenn Research Center Hall of Fame (2021).

Today, Olga is president and co-owner of her own company, GX Matrix Consulting LLC. She is also passionate about mentoring young women, encouraging them towards the STEM field, and exemplifying a positive and accomplished role model. She says that the most valuable advice she would give to her younger self would be: “Be more assertive, document your progress and achievements. It will help you as you move up the ladder and remind you that you are contributing to the mission.”

To hear more stories about professional women whose perseverance has made them inspirational figures in their fields, check out our podcast, The Art of Engineering.


To hear more about the art of engineering, sign up for our newsletter.