Published December 23, 2025

Planet Labs: The Operating System Anchoring a Prosperous World

For generations, maps were monuments to single moments in time. The Great Trigonometrical Survey took 70 years to measure India. Early commercial satellites cost hundreds of millions and doled out imagery the way courts grant warrants: one snapshot at a time, weeks apart. We watched a living planet through a strobe light, missing the trillions of changes happening in the dark intervals between flashes. Planet Labs saw a structural inefficiency everyone else had accepted. Founded in 2011 by NASA scientists Will Marshall, Robbie Schingler, and Chris Boshuizen, the company proved that mass-produced satellites using smartphone components could image the entire Earth daily for a fraction of legacy costs. This blog post traces how Planet Labs industrialized orbital sensing, achieved profitability, and evolved into the planetary measurement layer now anchoring climate intelligence, defense systems, and autonomous machines that require real-world spatial reasoning at global scale.

Grand Illusion of Static Geography

Since the dawn of modern cartography, the map has been treated as a finished product. From the hand-drawn charts of the Great Trigonometrical Survey, a grueling 70-year effort to measure the Indian subcontinent, to the first digitized grids of the late 20th century, the fundamental paradigm of geography was that the Earth could be treated as a static stage. Mountains, borders, and coastlines were mapped as if they were permanent fixtures, largely because the physical and financial toll of updating them was so immense. For generations, a map was not just a guide but a hard-won monument to a single moment in time. 

Figure. Thacker’s Reduced Survey Map of India (1897), a consolidated product of the Survey of India’s 19th-century measurement campaigns that compresses years of field surveys into a single, static snapshot of the landscape. (image source)

When the first commercial Earth observation (EO) satellites arrived, they reinforced this frame-by-frame reality. These were massive, exquisite machines: multi-ton, room-sized spacecraft like Landsat or WorldView, designed to take portraits of the planet. They offered stunning clarity, but they were precious and scarce. Because each system was a singular, high-stakes investment costing hundreds of millions of dollars, the view of the world refreshed only at the pace of long-lead procurement cycles.

Figure. Landsat 8 undergoing final integration at Vandenberg, part of a long-running Earth observation program built around monolithic systems optimized for stability over update frequency. (image source)

If you wanted to see a specific city or a farm, you had to task the satellite, a process that felt more like negotiating a judicial warrant than buying data. You waited for the orbital path to align, prayed for a cloudless window, and paid a premium for a single snapshot. The resulting history was fragmented. We were looking at a dynamic, living planet through a strobe light, missing the trillions of events that occurred in the long, dark intervals between shots: the planting of crops, the movement of convoys, the clearing of forests, the sudden spread of wildfire. The industry could see the where, but because updates were so rare and costly, it remained entirely blind to the when.

By the late 2000s, the EO industry was trapped in what some engineers called the “Iron Triangle”: resolution, frequency, and cost. To get high-quality data, you needed massive telescopes. To get massive telescopes, you needed heavy, expensive spacecraft. To launch those spacecraft, you needed government-scale budgets and decade-long development cycles. The prevailing model assumed that high resolution over selected targets, acquired occasionally, was sufficient. That assumption held when the world moved slowly, but it broke the moment environmental, economic, and security systems began to interact in real time.

The breakthrough didn't happen at a legacy aerospace prime but at NASA Ames, where a group of young scientists, Will Marshall, Robbie Schingler, and Chris Boshuizen, recognized a massive structural inefficiency. They saw consumer electronics improving on an annual cycle while satellite technology remained stagnant for decades. They were struck by the realization that mass-produced smartphones already carried better processors and sensors than spacecraft costing hundreds of millions of dollars. This insight led to PhoneSat, an experiment that proved off-the-shelf electronics could survive in orbit and outperform legacy hardware across several dimensions. By 2013, the team had successfully launched three mini-orbiters at a cost of just $7,000 each. Compared to $850M flagships like Landsat 8, or even the $164M price tag of modest Discovery-class missions like Genesis, PhoneSat was a revelation.

Figure. PhoneSat 1.0 during a high-altitude balloon test (left) and imagery transmitted from orbit (right), demonstrating a radically low-cost approach to satellite design using consumer smartphone hardware as a viable spaceborne sensor. (image source)

This clarified a larger point: the bottleneck was no longer hardware capability but the industry's risk aversion. Traditional exquisite systems were built to never fail because they were too expensive to replace, but the founders realized that if you made satellites cheap enough to fail, you could afford to succeed at a much larger scale. Drawing on their NASA experience, they saw that if a smartphone already contained the essential “guts” of a satellite, there was no reason a spacecraft had to be large or hand-crafted.

In 2011, the trio left NASA to found Planet Labs. They were driven by a mission that prioritized planetary utility over the traditional perfectionism of the aerospace industry. They shifted the goal from building a better satellite, which was traditionally a massive and expensive system designed for a decade of flawless operation, to building a new measurement layer for the planet. This shift changed the fundamental paradigm of geography.

Figure. Planet Labs’ founders with a Dove CubeSat, marking the break from flagship-era satellites toward agile, repeatable space systems. (image source)

Instead of focusing solely on the clarity of a single lens, they asked: If the planet is changing continuously, what would it take to make those changes visible continuously? This objective meant treating time as the primary constraint, rather than just pixel size. It required accepting that the true product would not be a single image, but the ability to compare yesterday and today for any point on Earth. They called this founding vision Mission 1: to image the entire landmass of the Earth, every single day, creating a living index of physical change. On November 9, 2017, six years after founding, Planet announced they had achieved this goal, successfully moving geography from the era of the Portrait to the era of the Scan.

Industrializing the SmallSat Revolution

Transitioning from the concept of a daily scan to its execution was fundamentally an industrial challenge. To image the entire landmass every 24 hours, Planet needed hundreds of sensors in orbit, a constellation without precedent. This scale rendered the traditional bespoke manufacturing model obsolete. Success required moving the vision out of the government lab and into a high-throughput operation where failure was treated as a design input rather than a catastrophe.

The trio moved into a San Francisco garage to begin this work. By choosing the Bay Area over traditional aerospace hubs, they were able to recruit from a talent pool that lived and breathed software culture. This allowed them to apply a software-centric philosophy to space, coining the term “Agile Aerospace”. Instead of spending a decade perfecting a single, massive spacecraft, they adopted a strategy of rapid hardware iteration. They launched “Builds”, design versions that functioned like software updates. If a sensor underperformed in Build 5, the fix was integrated into Build 6 while it was still on the assembly line.

Figure. An early Dove CubeSat, optimized for temporal frequency and global continuity, treating the satellite not as a standalone mission but as one node in a persistent measurement layer. (image source)

The first iteration of this new philosophy was the Dove. Each Dove followed a 3U CubeSat format, roughly the size of a loaf of bread, and was built around commodity processors and sensors. The pace in those early days was frantic. The strategy reached its first milestone in April 2013 with the launch of Dove 1 and Dove 2, the company's first Minimum Viable Products (MVPs) in orbit, designed to prove that off-the-shelf electronics could deliver high-quality data from space. When crucial hardware arrived late for the first production run, the founders turned to their internship program to help hand-build the fleet. Under their guidance, those interns cranked out 26 of the first 30 units on a schedule that veteran aerospace engineers said was impossible. This was the first proof that the model could scale: the goal was no longer just to build a satellite, but to build an assembly line that could sustain a massive constellation.

This model proved its worth through trial by fire. When an Antares rocket exploded in 2014, taking 26 Doves with it, the company did not stall. Because the satellites were low-cost and rapidly reproducible, the loss was an operational setback rather than a mission-ending catastrophe. They absorbed the blow as a bad day at the office and simply launched the next flock. With the practice of Agile Aerospace, redundancy had successfully shifted from the individual satellite to the entire constellation.

Success also required a new approach to the launch market. Planet pioneered the use of the International Space Station (ISS) as a primary deployment platform, packing flocks of Doves into cargo bags for routine resupply missions. Once aboard, the satellites were passed through the station's airlock, positioned by the robotic arm, and spring-ejected into orbit. By utilizing the ISS and early rideshare opportunities on commercial rockets, Planet industrialized the launch process, proving that the vacuum of space could be accessed on a predictable and recurring schedule.

While the Doves were perfecting the daily scan, a parallel revolution was unfolding at Skybox Imaging. Founded in 2009, Skybox took a distinct engineering path. Where Planet's Doves were optimized for high-frequency coverage, Skybox built its SkySats as slightly larger, mini-refrigerator-sized microsatellites designed for high-fidelity performance. They were the first commercial microsatellites capable of sub-meter resolution and high-definition video, capabilities that had previously required satellites the size of school buses. This chapter is deeply personal to our team. Our Partner, Tom Ingersoll, served as the CEO of Skybox, leading the company from its early concept through the launch of its first high-resolution systems. Under Tom's leadership, Skybox proved that small satellites could meet the do-or-die reliability standards of the intelligence and defense communities while costing only a fraction of legacy systems. His tenure culminated in a landmark $500M acquisition by Google in 2014 (which later renamed the business Terra Bella), the first major exit of the entrepreneurial space era.

Figure. Tom Ingersoll, Space Capital Partner and former Skybox Imaging CEO, with CPO Dan Berkenstock and an early SkySat, showing that image fidelity and deployment speed no longer had to trade off. (image source)

Recognizing that this revolution would eventually scale to define the entire market, Space Capital invested in Planet in 2015. We saw that the same principles of agility and cost-efficiency Tom had championed at Skybox were the blueprint for a new industrial age in orbit. We recognized that geospatial intelligence (GEOINT) was evolving from a snapshot business into a global utility that required both high-frequency coverage and high-resolution clarity. While the Dove constellation provided an unprecedented daily scan of the Earth, the vision required the high-fidelity Zoom Lens that Skybox had perfected.

In 2017, these two essential strands of the revolution merged when Planet acquired the SkySat fleet from Google. This was the moment the hardware journey reached maturity, uniting these two capabilities under a single, dynamic architecture. This created a “Tip and Cue” system: the Doves act as a global net to detect changes anywhere on Earth, tipping the system to a point of interest, while the SkySats are cued to zoom in with sub-meter precision to confirm the details. As the hardware continued to iterate, the original Dove evolved into the SuperDove, featuring enhanced telescopes and additional spectral bands that allowed the daily scan to meet the rigorous calibration standards of global scientific and defense agencies. 
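To make the tip-and-cue workflow concrete, here is a minimal sketch in Python of what that control loop could look like. The data fields, detector, and tasking call are illustrative assumptions for this post, not Planet's actual tasking interface.

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional

@dataclass
class ChangeEvent:
    lat: float    # latitude of the detected change
    lon: float    # longitude of the detected change
    score: float  # change-detection confidence, 0.0 to 1.0

@dataclass
class TaskingOrder:
    lat: float
    lon: float
    resolution_m: float  # requested ground sample distance in meters

def detect_change(scene: dict) -> Optional[ChangeEvent]:
    """Placeholder 'tip' step: in practice this would compare today's
    broad-area scene against yesterday's baseline; here it simply reads
    a precomputed change score from the scene record."""
    if scene["change_score"] > 0.0:
        return ChangeEvent(scene["lat"], scene["lon"], scene["change_score"])
    return None

def tip_and_cue(daily_scenes: Iterable[dict], cue_threshold: float = 0.8) -> List[TaskingOrder]:
    """Broad-area monitoring 'tips' the system; only the strongest
    detections 'cue' a sub-meter collect over the same coordinates."""
    orders = []
    for scene in daily_scenes:
        event = detect_change(scene)
        if event and event.score >= cue_threshold:
            orders.append(TaskingOrder(event.lat, event.lon, resolution_m=0.5))
    return orders

# Example: two daily scenes; only the high-confidence change triggers tasking.
scenes = [
    {"lat": -3.1, "lon": -60.0, "change_score": 0.92},  # likely forest clearing
    {"lat": 52.5, "lon": 13.4, "change_score": 0.15},   # background noise
]
print(tip_and_cue(scenes))
```

The design point is simply that the wide net runs everywhere every day, while the expensive high-resolution asset is spent only where the net reports something worth confirming.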

Figure. A SkySat undergoing integration at Skybox Imaging, whose acquisition by Google marked the first major validation and exit of the modern EO small-satellite era. (image source)

The revolution has now moved beyond shrinking hardware to standardizing a Common SmallSat Platform: a modular, high-performance bus that underpins a diverse family of sensors. This platform powers Pelican, the high-agility successor to SkySat, which entered full commercial operations in 2025. Alongside it sits Tanager, a hyperspectral mission developed with NASA JPL and the Carbon Mapper coalition. Launched in 2024, Tanager uses over 400 spectral bands to pinpoint methane and CO2 leaks at the facility level, turning a small satellite into a precise tool for global environmental enforcement.

Figure. Planet’s integrated constellation, including SuperDoves for daily global monitoring, SkySats and Pelicans for high-resolution tasking, and Tanager for hyperspectral detection, is designed with spectral bands optimized for cross-sensor analysis across a single planetary measurement layer. (image source)

Building on this modular foundation, the pinnacle of this evolution is Owl, the next-generation monitoring fleet. Designed to deliver near-daily, 1-meter-class imagery of the entire landmass, these satellites feature a planned 60+ km swath width. By covering four times more ground per pass than its predecessors, the fleet provides the expansive reach needed to detect large-scale trends across time and space with unprecedented precision and clarity.

From the shoebox-sized Doves to the wide-swath power of Owl, we have witnessed the transformation of a scrappy lab into an industrial-scale operation that redefined the global sensing layer. This vision has now scaled into the largest Earth-observation fleet in history. Since 2013, Planet has successfully deployed over 650 satellites and currently maintains approximately 200 active spacecraft in orbit, a constellation that captures more than 25 terabytes of imagery every single day.

Figure. Owl, Planet’s next-generation monitoring fleet, reflects how the architecture has evolved far beyond the Dove era, delivering substantially greater power, capacity, and coverage within the same continuous planetary measurement system. (image source)

Forging the Operating System

Planet’s transformation from a small-satellite manufacturer into the architect of a planetary operating system (OS) has unfolded over more than a decade. Each chapter has added a new layer of capability. In the early 2010s, the company focused solely on scale: how to place enough sensors in orbit to remake Earth as a dataset. An early constellation of 28 Doves deployed from the ISS proved that daily global monitoring was achievable. Yet as imagery began flowing in torrents, a deeper truth emerged. The constraint was no longer in orbit but on the ground. Customers needed more than just images. They required consistent, calibrated signals that could feed actionable insights directly into financial models and government workflows.

The period from 2014 to 2017 was foundational for this software shift. The acquisition of BlackBridge, a German operator of the RapidEye satellite fleet, brought a decade-long historical archive into the fold. This gave Planet temporal depth for the first time, allowing customers to look back in time to see how a specific landscape had changed over ten years. With the SkySat integration finalized in 2017, the OS gained its final sensor layer, allowing Planet to fuse high-resolution tasking directly into its broad-area monitoring workflows.

These assets turned Planet into a multi-resolution system: Doves provided the daily scan, RapidEye provided the historical baseline, and SkySats provided the high-definition detail. In parallel, the acquisition of Boundless Spatial, a leader in open-source mapping software, embedded deep GEOINT expertise into the company. This helped move Planet from a simple data provider to a developer-first ecosystem where users could build their own applications on top of Planet's data.

By the late 2010s, Planet was no longer merely collecting data. It was organizing Earth intelligence. Engineers focused on harmonization, which is the process of making images from different satellites look and behave exactly the same. This meant that an image taken over Brazil could be compared perfectly to an image taken over Berlin. Planet Basemaps emerged as a key product. These are seamless and cloud-free mosaics that behave like a living map rather than a series of disjointed snapshots. Underneath, the company automated its ground segment. This vertically integrated infrastructure handled terabytes of data per day with minimal human intervention.

Even as older satellites like RapidEye were retired in 2020, their archives remained a permanent part of the foundation. That same year, Planet stepped into the role of global public infrastructure through the NICFI tropical forest program, an initiative that transitioned in January 2025 into the permanent Tropical Forest Observatory (TFO). This shift established Planet’s data as the official baseline and primary transparency layer for global environmental governance. This kind of objective, physical data is exactly what supports carbon markets as they navigate challenges with liquidity and standardization. As governments and financial institutions move toward stricter reporting requirements, Planet provides the independent verification needed to make environmental claims auditable, whether that means tracking facility-level methane leaks with the Tanager fleet or monitoring the carbon sequestration of a protected rainforest in real time.

The early 2020s marked an ascent into derived analytics, the process of extracting specific facts from images. In 2021, the VanderSat acquisition brought microwave-based sensing technology that could see through clouds to measure soil moisture. These capabilities were transformed into Planetary Variables, which turn raw pixels into easy-to-read indicators like crop health or water content. That same year, Planet established itself as the sector's financial leader by going public on the NYSE on December 8, 2021 (ticker: PL). As the first definitive pure-play Earth intelligence company, its debut served as a catalyst for the industry, sparking a flurry of major space infrastructure companies that followed it into the public markets.
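As a rough illustration of what turning raw pixels into an indicator involves, the sketch below computes a standard normalized difference vegetation index (NDVI) from red and near-infrared reflectance. This is a generic, textbook calculation with made-up values, not Planet's proprietary Planetary Variables methodology.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in near-infrared and absorbs red,
    so values near 1 suggest dense canopy, while values near 0 suggest bare
    soil, water, or stressed crops."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    # Guard against division by zero over nodata pixels.
    return np.where(denom > 0, (nir - red) / denom, 0.0)

# Toy 2x2 scene: surface-reflectance values scaled 0..10000, a common
# convention for analysis-ready imagery.
red_band = np.array([[420, 600], [3000, 450]])
nir_band = np.array([[3800, 3500], [3200, 900]])

print(ndvi(red_band, nir_band).round(2))
```

A field-averaged index like this, tracked day over day, is the kind of easy-to-read signal a derived analytics product exposes, so a customer watches a simple curve rather than interpreting raw imagery.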

Figure. PlanetScope is one of the primary products built on Planet’s operating system, shown here delivering analysis-ready, surface-reflectance imagery so the same location can be compared accurately over time and used directly in machine-learning and analytical workflows. (image source)

In 2023, the acquisitions of Salo Sciences for wildfire risk and Sinergise for its Sentinel Hub data platform modernized how data was distributed, allowing users to stream data from Planet and government satellites side by side in real time. By the middle of the decade, the OS entered a new phase of sovereign intelligence. In July 2025, Planet signed a landmark €240M agreement with the German government to provide dedicated capacity on its Pelican satellites. This deal is more than a large contract: it marks the birth of a new era in which allied nations require their own dedicated sentinels in the sky. By doubling its production capacity with a new Berlin facility in late 2025, Planet is moving from a centralized data provider to a global infrastructure partner, allowing nations to control the tasking and security of their own dedicated sensors as a cost-effective alternative to maintaining massive satellite programs of their own.

The final piece of the OS was the integration of AI directly into the sensor. In November 2025, Planet acquired Bedrock Research, a GEOINT AI provider, to serve as the brain for the Owl fleet. Combined with major defense contracts for maritime monitoring, this acquisition moves Planet from simply taking images to broadcasting real-time alerts. By the end of 2025, this platform strategy reached a historic turning point. In its Q3 fiscal 2026 report, Planet announced its fourth consecutive quarter of adjusted EBITDA profitability, a milestone supported by a high-margin recurring revenue model delivering a 60% non-GAAP gross margin. This momentum is anchored by a $734M backlog, including $672M in firm, multi-year contracts. Not long after these results, broader market recognition of Planet’s operational execution and intensifying interest in space as an industry helped the stock reach a new all-time high of $21, bringing the company’s market cap to $6.5B. Planet’s systematic expansion has proven that GEOINT is not a nice-to-have, but the bedrock subscription driving the decision-making of the global economy.

Race for the Orbital Cloud

The most consequential choices in national security, climate resilience, and financial stability can no longer rely on human reaction time alone. While Generative AI and Large Language Models have revolutionized communication, they remain untethered from reality. An LLM can describe a cup of coffee, but it lacks the inherent understanding of gravity, friction, or spatial coordination required to move that cup across a room. On a global scale, digital-only AI has no concept of how our physical world actually functions, making it an unreliable partner for critical decision-making. The natural evolution of AI is to move out of the digital sandbox and into the real world, becoming a physical superintelligence that can close the massive productivity gap.

Figure. As Physical AI moves from controlled environments into the real world, machines need more than local perception. Planet’s GEOINT supplies planetary-scale context that complements local sensors, already supporting autonomous vehicles, drones, and infrastructure systems operating in dynamic environments. (image source)

Physical AI has become the industry shorthand for this bridge: the spatial reasoning that allows a digital brain to inhabit a physical body. While massive fleets of humanoid robots have yet to hit the streets, the shift is already beginning with the most established robotics on our roads: autonomous vehicles. Today, they use local cameras to avoid immediate obstacles, but they cannot perform complex trip planning or navigate a changing environment without a high-fidelity understanding of the world they inhabit. A future fleet of autonomous drones for urban logistics will require that same global awareness for the same reason. In a large-scale mining operation, for example, a fleet of autonomous machines needs more than just onboard radar to function; it requires fresh GEOINT context to navigate shifting terrain, monitor structural hazards, and identify the optimal locations to drill. Physical AI is the missing piece for the era of autonomous labor, but it requires a fusion of perspectives: while the machine's local sensors handle split-second reactions, Planet helps build the World Models that provide the operational context beyond what systems can observe locally. This marks a transition from passive tools that wait for instructions to active systems that possess true operational agency.

But the mission to scale this intelligence has hit a physical wall on Earth. In major tech hubs, terrestrial data centers are increasingly throttled by power grid congestion and the immense volumes of water required for cooling. In orbit, the rules of physics offer a radical escape. By generating immense power at the source and shedding heat into the cold vacuum of space, an orbital data center can operate at densities that are becoming impossible on Earth. Without atmospheric interference and with near-constant exposure to peak sunlight, a solar panel in the right orbit can be up to eight times more productive than the same panel on the ground. This vision reached a defining milestone in late 2025 with Project Suncatcher, a collaboration between Google and Planet Labs. The goal is to deploy Google’s Tensor Processing Units (TPUs) onto Planet’s solar-powered satellites, moving high-density AI inference directly to the edge.

When these high-density nodes are linked together via high-speed optical laser links, they transcend individual hardware to become the Orbital Cloud. This is a shift in where the central intelligence of the global economy resides. Beyond energy efficiency, the Orbital Cloud offers a unique strategic advantage in its global reach and infrastructural independence. By processing data at the source and integrating with the next generation of high-throughput SatCom networks, this model points toward a future where critical intelligence can be generated and accessed anywhere on Earth, even in regions where terrestrial power, fiber, or data centers are constrained, unreliable, or politically inaccessible.

However, the challenge today is no longer getting silicon into orbit, but sustaining high-density compute under the physical constraints of space. Doing so requires new architectures that address two fundamental realities: thermal management and radiation. In the vacuum of space, heat cannot be carried away by air; it must be shed through radiative cooling. High-performance AI chips generate intense heat that would traditionally require complex liquid-pump loops, which add mass and significant failure points. Instead, researchers are looking toward passive, high-conductivity materials like annealed pyrolytic graphite. Much like a candle wick pulls wax, these materials efficiently draw heat away from processors and conduct it to the spacecraft exterior, where it can radiate into the vacuum. This enables AI silicon to operate at higher densities within safe thermal limits. Protecting that silicon follows the same hardware-software fusion Planet used to de-risk commodity electronics a decade ago. Rather than relying solely on slow and prohibitively expensive rad-hardened chips, the industry is demonstrating that modern processors can operate reliably in orbit when paired with advanced shielding and sophisticated software-based error correction.
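To see why radiative cooling dominates the design, here is a back-of-the-envelope radiator sizing sketch based on the Stefan-Boltzmann law. The temperatures, emissivity, and power figures are illustrative assumptions for this post, not specifications of any actual spacecraft.

```python
# In vacuum, a surface sheds heat only by radiating it, at a rate set by
# the Stefan-Boltzmann law:
#   P = emissivity * sigma * area * (T_radiator^4 - T_environment^4)

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area_m2(heat_watts: float,
                     t_radiator_k: float = 350.0,   # assumed ~77 C radiator
                     t_environment_k: float = 3.0,  # deep-space background
                     emissivity: float = 0.9) -> float:
    """Radiator area needed to reject a given heat load to deep space."""
    flux = emissivity * SIGMA * (t_radiator_k**4 - t_environment_k**4)
    return heat_watts / flux

# Example: rejecting a 10 kW compute load at these assumed conditions takes
# roughly 13 square meters of radiator. Every degree lost between the chip
# and the radiator lowers T_radiator and inflates that area, which is why
# passive, high-conductivity paths from processor to exterior matter so much.
print(round(radiator_area_m2(10_000), 1))
```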

Figure. Microsoft’s Fairwater data center shows how far Earth-based AI infrastructure has scaled. Planet’s collaboration with Google explores in-orbit compute, where continuous solar exposure and direct heat radiation could eventually ease the power and cooling constraints facing terrestrial facilities. (image source)

While the primary goal is high-density processing, the engineering required to master this cloud architecture also makes every spacecraft more resilient. A stronger “brain” on board allows any satellite to extract insights from its own data, troubleshoot its own hardware, perform predictive maintenance, and autonomously avoid debris, transforming the spacecraft from a mere sensor into a self-aware node. This effectively raises the floor for all orbital operations. With the forthcoming Owl fleet already carrying onboard NVIDIA Jetson AI platforms, the first layer of interpretation, which distills raw pixels into actionable data, now happens at the source. The Earth is no longer merely a place we observe, but a world that can trigger its own chain of response. A wildfire ignition or a vessel deviation can be flagged autonomously, sending an alert directly to a downstream system before the raw data even hits the ground.
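A minimal sketch of what that onboard triage could look like is below: run a detector on the captured frame and downlink a compact alert, rather than the full image, whenever something crosses a confidence threshold. The detector, field names, and alert format are hypothetical illustrations, not the software actually flying on Owl or the Jetson platform.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Alert:
    event_type: str   # e.g. "wildfire_ignition" or "vessel_deviation"
    lat: float
    lon: float
    confidence: float
    frame_id: str     # points back to the full-resolution frame kept onboard

def onboard_triage(frame, run_detector, min_confidence=0.85):
    """Edge-inference step: only a few hundred bytes of alert go down
    immediately; the multi-gigabyte raw frame can follow on a later pass."""
    detections = run_detector(frame)  # caller supplies the onboard model
    alerts = [
        Alert(d["label"], d["lat"], d["lon"], d["conf"], frame["id"])
        for d in detections
        if d["conf"] >= min_confidence
    ]
    # Serialize the alerts for a low-bandwidth downlink channel.
    return [json.dumps(asdict(a)) for a in alerts]

# Toy example with a stub detector standing in for the onboard model.
def stub_detector(frame):
    return [
        {"label": "wildfire_ignition", "lat": 37.2, "lon": -120.5, "conf": 0.93},
        {"label": "vessel_deviation", "lat": 35.1, "lon": 23.7, "conf": 0.41},
    ]

frame = {"id": "owl-000123", "pixels": None}
for message in onboard_triage(frame, stub_detector):
    print(message)
```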

When we first backed Planet a decade ago, they were a small group of young pioneers actively pushing the frontier of a hardware revolution. They proved that an agile model could fundamentally redefine our understanding of geography and achieve industrial scale in orbit. Today, that vision has matured into a global infrastructure powerhouse that helps the physical world see, measure, think, and reason. In the next decade, we are excited to see Planet’s continued rise as the anchor for this new age of superintelligence. No longer just an observer of our world, Planet is becoming a foundational layer for a more autonomous, resilient, and prosperous future.
