Applied Urban & Engineering Geology

The Subsurface as a Commons: Rethinking Urban Geotechnical Data for Equitable Resilience

This guide explores a fundamental shift in how cities manage the ground beneath them. We argue that treating geotechnical data—the detailed information about soil, rock, and groundwater—as a privately held commodity undermines long-term urban resilience and equity. Instead, we propose framing the subsurface as a shared commons, where data is collaboratively managed for the public good. We examine the ethical and sustainability imperatives driving this change, from preventing disproportionate climate burdens on vulnerable communities to eliminating the waste of redundant site investigations, then compare governance models and outline a step-by-step path to building a local commons.

Introduction: The Hidden Crisis Beneath Our Feet

Urban resilience is often visualized through iconic infrastructure—seawalls, green roofs, reinforced buildings. Yet, the most critical component for a city's long-term survival and equity lies hidden from view: the subsurface. For decades, geotechnical data—the detailed profiles of soil composition, groundwater levels, bedrock depth, and contamination—has been treated as a proprietary asset, locked away in private engineering reports or disparate municipal files. This fragmentation creates a profound knowledge asymmetry. When a major developer plans a high-rise, they commission a detailed site investigation. Meanwhile, a community group advocating for flood protection in a historically underserved neighborhood lacks access to the same caliber of data to make their case. This guide argues that this paradigm is not just inefficient; it is ethically and sustainably untenable. We must reconceptualize the subsurface as a commons—a shared resource managed for collective benefit. This shift is essential for building cities that are not only physically resilient to climate shocks but also socially equitable in how they distribute risk and opportunity. The path forward requires new models of data stewardship, collaboration, and governance that prioritize long-term public value over short-term private gain.

The Core Problem: Data Silos and Systemic Blindness

The traditional model creates systemic blindness. In a typical redevelopment project, a geotechnical firm drills boreholes, analyzes samples, and produces a report solely for that client's foundation design. That data, often revealing crucial insights about regional aquifer connections or landslide-prone strata, then disappears into a filing cabinet. A decade later, a city agency planning a new stormwater tunnel a few blocks away must budget to drill new boreholes, incurring duplicate costs and potentially missing the interstitial patterns that only a synthesized dataset could reveal. This waste of resources is a direct sustainability failure, consuming energy, materials, and funds for redundant work. From an equity lens, the consequences are starker. Lower-income communities, often built on less desirable land with higher flood risk or poor soil conditions, rarely attract the private investment that generates high-quality subsurface data. This lack of information perpetuates a cycle of disinvestment and increased vulnerability, as public agencies cannot effectively target mitigation resources without a clear understanding of the ground risks.

Defining the "Subsurface Commons"

The concept of a "commons" reframes the subsurface not as empty space to be claimed, but as a complex, interconnected ecological and geological system that provides essential services—foundation support, water filtration, heat exchange, and carbon storage. Treating it as a commons means recognizing that data about this system is a public good, akin to weather data or topographic maps. Its value multiplies through sharing and integration. The goal is not to eliminate private investigation but to ensure that the insights gleaned from it contribute to a shared, evolving knowledge base. This creates a virtuous cycle: each new project enhances the collective understanding, which in turn lowers costs, reduces unforeseen risks, and informs better planning for all future projects, especially those serving the public interest.

A Note on Practical Application

The recommendations in this guide are based on observed industry trends and professional discourse. They are intended for informational purposes to stimulate discussion and planning. For specific legal, regulatory, or financial decisions related to geotechnical data sharing or infrastructure projects, consulting with qualified legal, engineering, and planning professionals is essential.

The Ethical and Sustainability Imperative for Change

Moving beyond efficiency arguments, the drive to treat subsurface data as a commons is rooted in deeper ethical and sustainability obligations. Urban resilience cannot be equitable if the knowledge required to build it is hoarded. Climate change acts as a threat multiplier, but its impacts are not felt uniformly. A neighborhood sitting on compactable fill or with a high water table will experience more severe flooding and infrastructure damage during intense rainfall than one on stable bedrock. Without a transparent, accessible map of these vulnerabilities, adaptation investments risk flowing to areas with the loudest political voices or the highest property values, not the greatest need. A commons approach directly addresses this by making risk visible to all, empowering community advocates, planners, and policymakers with the same factual basis for decision-making. This transparency is a prerequisite for justice in resilience planning, ensuring that historically marginalized communities are not further burdened by hidden geological hazards.

Sustainability Through Avoided Waste

From a pure resource perspective, the current model is remarkably wasteful. Redundant site investigations represent a significant carbon footprint from drilling rigs, laboratory testing, and transportation. More subtly, the lack of integrated data leads to suboptimal design. A foundation may be over-engineered because the designer lacked context about nearby soil conditions, using more concrete and steel than necessary. A geothermal system might be undersized because the installer had no data on local thermal conductivity. A commons database enables right-sized, efficient design, minimizing embodied carbon and operational energy use. It also prevents catastrophic failures that require complete reconstruction—the ultimate in unsustainable practice. By seeing the subsurface as a whole system, we can design with it, not just on top of it, leveraging its natural capacity for stormwater infiltration, thermal mass, and stability.

Long-Term Stewardship vs. Short-Term Extraction

The ethics of the commons extend to intergenerational equity. The subsurface is a non-renewable resource in many ways; once contaminated or structurally compromised, remediation is extraordinarily difficult and costly. Treating data as private property aligns with a mindset of extraction—get what you need for your project and move on. A commons mindset fosters stewardship. It encourages contributors to think about the long-term health of the urban underground, as their data will become part of a permanent record. This can influence investigation practices, prompting more thorough contamination screening or monitoring well installations that benefit future users. It transforms geotechnical engineering from a service for a single client into a profession contributing to the enduring safety and habitability of the city.

Building Trust Through Transparency

A critical, often overlooked, benefit is the cultivation of public trust. When infrastructure projects encounter delays, cost overruns, or failures, the cause is frequently attributed to "unforeseen ground conditions." To the public, this can sound like negligence or excuse-making. A shared subsurface data platform demystifies this process. It allows agencies to show, with evidence, the complexity of the challenges they face. It can engage communities in planning by visualizing the constraints and opportunities beneath their feet. This transparency turns the subsurface from a source of mysterious risk into an understood component of the shared urban environment, fostering a more informed and collaborative public dialogue about resilience investments.

Core Concepts: From Private Asset to Public Good

To operationalize the subsurface commons, we must dismantle old assumptions and build new conceptual frameworks. The core shift is a change in value perception: the value of geotechnical data is not in its scarcity but in its connectivity. A single borehole log has limited utility; its true worth is realized when correlated with hundreds of others to reveal depositional patterns, groundwater gradients, or the extent of a historic landfill. This requires rethinking data ownership, quality, and structure. It moves from a project-centric document (the geotechnical report) to a system of structured, machine-readable data points (depth, soil classification, Standard Penetration Test value, groundwater level) tagged with precise location coordinates. This granularity allows for aggregation, spatial analysis, and modeling that are impossible with PDFs in a digital drawer. The goal is to create a living, three-dimensional model of the urban underground—a "subsurface digital twin" that is continuously updated with new investigations.
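The move from project-centric PDFs to structured, machine-readable records can be sketched as a minimal data model. The field names and units below are illustrative assumptions, not a published standard; a real commons would define its own schema:

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class BoreholeObservation:
    """One depth interval from a borehole log, in machine-readable form."""
    borehole_id: str
    latitude: float          # WGS84 decimal degrees
    longitude: float
    depth_top_m: float       # top of interval, metres below ground surface
    depth_bottom_m: float
    uscs_class: str          # Unified Soil Classification System symbol, e.g. "CL"
    spt_n_value: Optional[int] = None     # Standard Penetration Test blow count
    groundwater_depth_m: Optional[float] = None

# A single log entry becomes a structured record that can be aggregated,
# mapped, and correlated with hundreds of others.
obs = BoreholeObservation(
    borehole_id="BH-2024-017",
    latitude=47.6062, longitude=-122.3321,
    depth_top_m=3.0, depth_bottom_m=4.5,
    uscs_class="CL", spt_n_value=12, groundwater_depth_m=2.1,
)
print(json.dumps(asdict(obs)))
```

Because each record carries its own coordinates and depths, thousands of such records can be loaded into a spatial database and interpolated into the three-dimensional "subsurface digital twin" described above.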

Data as a Common-Pool Resource

Elinor Ostrom's Nobel-winning work on common-pool resources provides a vital lens. She identified principles for successfully managing shared resources like fisheries or irrigation systems, which apply directly here. These include clearly defined boundaries (the geographic extent of the city), rules matching local conditions, collective-choice arrangements where data contributors have a voice in governance, monitoring, graduated sanctions for non-cooperation, conflict-resolution mechanisms, and recognition of rights to organize. A subsurface data commons is not a free-for-all; it is a carefully governed institution. Unlike fish or water, the resource units (data points) are non-subtractable—one user's analysis doesn't prevent another's—but the resource system (the shared database) must still be protected from degradation (e.g., by pollution with low-quality data).

The Role of Standards and Interoperability

A commons cannot function without shared language. The lack of universal standards for soil classification, laboratory testing methods, and data formats is a major technical barrier. One firm may use the Unified Soil Classification System (USCS), while another uses the AASHTO system; a third may record moisture content differently. Building a commons necessitates the adoption of, or mapping to, a common schema. This often involves championing existing open standards from bodies like the Open Geospatial Consortium (OGC) or building lightweight, pragmatic local standards that define minimum required fields and quality controls. Interoperability—the ability of different systems to exchange and use data—is the technical bedrock of the commons. It ensures that data from a small environmental nonprofit can be integrated with data from a multinational engineering firm and a city's public works department.
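The "mapping to a common schema" described above is, in practice, a translation layer: each contributor registers how their export format maps onto the shared field set. A minimal sketch, in which the source field names, the target schema, and the example mapping are all hypothetical:

```python
# Target field set of the (hypothetical) commons schema.
TARGET_FIELDS = {"borehole_id", "depth_m", "soil_class", "gw_depth_m"}

# Each contributor registers a mapping from their own export format
# to the shared schema; this one is invented for illustration.
FIRM_A_MAPPING = {
    "HoleID": "borehole_id",
    "Depth": "depth_m",
    "USCS": "soil_class",
    "WaterTable": "gw_depth_m",
}

def normalize(record: dict, mapping: dict) -> dict:
    """Translate one contributor record into the commons schema.

    Unmapped fields are preserved as metadata rather than discarded,
    so provenance is never lost in translation."""
    out, extra = {}, {}
    for src_key, value in record.items():
        target = mapping.get(src_key)
        if target in TARGET_FIELDS:
            out[target] = value
        else:
            extra[src_key] = value
    out["_source_metadata"] = extra
    return out

raw = {"HoleID": "A-12", "Depth": 6.5, "USCS": "SM", "Logger": "J. Doe"}
print(normalize(raw, FIRM_A_MAPPING))
```

The design choice worth noting is that unrecognized fields are retained rather than dropped: interoperability should never silently destroy information a future user might need.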

Incentive Structures: What Motivates Contribution?

A purely altruistic model is unlikely to succeed at scale. The governance model must create clear incentives for private entities to contribute their data. The most powerful incentive is reciprocal value: contributors get far more data out than they put in. A developer planning a site gets immediate context from dozens of nearby historical investigations, potentially reducing their own initial investigation scope. Other incentives can include regulatory streamlining (fast-tracked permits for projects that submit data in the standard format), reputational benefits, or mandatory contribution for projects on public land. The key is to design a system where contributing is the easiest, most beneficial path forward, aligning private interest with the public good.

Comparing Governance Models for a Subsurface Data Commons

There is no one-size-fits-all model for governing a subsurface commons. The right approach depends on local legal frameworks, political culture, existing infrastructure, and funding sources. Below, we compare three primary governance archetypes, analyzing their pros, cons, and ideal application scenarios. This comparison is crucial for cities or regions deciding how to initiate their own commons journey.

Public-Steward Model
Description: A government agency (e.g., city planning, public works, geological survey) acts as the central custodian, curator, and distributor of the data commons.
Pros: High public trust and accountability; potential for regulatory integration (e.g., linking to building permits); stable, long-term funding potential from public budgets.
Cons: Can be slow to innovate; may lack technical expertise in data management; vulnerable to political shifts and budget cycles; may struggle to incentivize private-sector participation.
Best for: Cities with strong, technically capable public agencies and a political mandate for open data; regions where subsurface risk is a clear public safety issue (e.g., seismic zones).

Multi-Stakeholder Consortium
Description: A legally independent entity governed by a board representing key stakeholders: government, engineering firms, academia, utilities, developers, and community groups.
Pros: Balances diverse interests; can move faster than pure government; draws on private-sector innovation and expertise; funding can be diversified (membership fees, grants, service fees).
Cons: Governance can be complex and contentious; requires significant upfront effort to establish trust and rules; may face challenges in ensuring data quality from diverse contributors.
Best for: Metropolitan regions with many independent jurisdictions; areas with a strong culture of public-private partnership; initiatives aiming for rapid, collaborative scaling.

Platform-Cooperative Model
Description: A user-owned and operated digital platform built on cooperative principles. Data contributors are member-owners who govern the platform and share in any generated value.
Pros: High level of contributor buy-in and trust; aligns incentives perfectly (owners benefit from network growth); can be highly agile and innovative.
Cons: Extremely difficult to bootstrap without critical mass; legal structure can be complex; may lack the perceived neutrality of a public entity; long-term sustainability can be a challenge.
Best for: Niche professional communities (e.g., geothermal installers, environmental consultants) seeking to solve a specific data gap; initiatives with strong grassroots or cooperative-movement support.

Hybrid and Phased Approaches

In practice, many successful initiatives begin as a pilot within a public agency (Public-Steward) to establish credibility and core datasets, then evolve into a more independent consortium to scale and incorporate private data. A critical success factor is the clear separation of the governance body (which sets policy, standards, and access rules) from the technical platform team (which manages the database and IT infrastructure). This ensures that technical decisions serve the governance principles, not the other way around.

A Step-by-Step Guide to Building Your Local Commons

Launching a subsurface data commons is a multi-year endeavor requiring technical, social, and political work. This step-by-step guide outlines a pragmatic path from concept to operational reality, emphasizing stakeholder engagement and iterative development.

Step 1: Convene a Founding Coalition

Do not attempt this alone. Identify and bring together 5-10 key organizations that represent the ecosystem: a forward-thinking city or county agency, a major university with geoscience or engineering departments, a representative from the local professional engineering society, a large utility company (water, gas, electric), and a community-based environmental organization. The goal of this initial group is not to solve all problems, but to draft a shared vision statement and secure resources for a feasibility study.

Step 2: Conduct a Landscape Audit and Legal Review

Before building anything, understand what exists. Task the coalition with auditing existing data holdings: what subsurface data do public agencies already have? In what formats? What are the legal and contractual barriers to sharing it? (Many old consultant contracts claim full copyright over reports). Simultaneously, review relevant federal, state, and local laws regarding data liability, public records, and professional licensure. This step identifies low-hanging fruit (datasets that can be shared immediately) and major roadblocks to address.

Step 3: Develop a Minimum Viable Product (MVP) Schema

Avoid "boiling the ocean." Define the smallest set of data fields that would provide immediate value. This often includes: location (coordinates, elevation), basic lithology (soil/rock type using a common classification), depth to groundwater, and Standard Penetration Test (SPT) N-values. Create a simple, open data standard (e.g., a CSV template or JSON schema) for this MVP. The simplicity lowers the barrier for initial contributions and allows for quick wins.
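As a concrete illustration of what "a simple CSV template" for the MVP could look like, here is a minimal sketch. The column names are assumptions chosen to match the fields listed above; an actual commons would publish and version its own template:

```python
import csv, io

# Illustrative MVP column set; an actual commons would publish its own.
MVP_COLUMNS = [
    "borehole_id", "latitude", "longitude", "ground_elevation_m",
    "depth_m", "soil_class_uscs", "spt_n_value", "groundwater_depth_m",
]

def make_template() -> str:
    """Return an empty CSV template contributors can fill in."""
    buf = io.StringIO()
    csv.writer(buf).writerow(MVP_COLUMNS)
    return buf.getvalue()

def parse_submission(text: str) -> list[dict]:
    """Read a contributor CSV, keeping only the MVP columns."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        rows.append({k: row.get(k, "") for k in MVP_COLUMNS})
    return rows

# One filled-in row appended to the template header.
sample = make_template() + "BH-1,47.60,-122.33,12.0,4.5,CL,9,2.1\n"
print(parse_submission(sample)[0]["soil_class_uscs"])
```

The deliberate smallness of the column set is the point: a contributor can satisfy the standard with a spreadsheet and five minutes, which is what makes early quick wins possible.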

Step 4: Launch a Pilot Project with a Clear Use Case

Select a geographically bounded area (e.g., a specific watershed or downtown district) facing a known challenge, such as chronic basement flooding or redevelopment pressure. Use the MVP schema to aggregate all existing public data for that area. Then, engage private firms working on projects there, offering them access to the aggregated dataset in exchange for contributing their new data in the standard format. This demonstrates tangible value in a controlled setting.

Step 5: Establish Provisional Governance and Incentives

Based on the pilot's lessons, draft provisional governance rules. Decide on an initial governance model (see comparison table). Create clear, written policies on data quality control, licensing (strongly recommend a Creative Commons CC-BY or similar open license), attribution, and liability disclaimers. Design the first formal incentives, such as a commitment from the city planning department to provide expedited review for projects that submit compliant data.

Step 6: Scale the Platform and Community

With a proven pilot and provisional governance, seek more stable funding (grants, member dues, public allocation). Invest in a more robust, user-friendly data platform that allows for visualization, querying, and basic API access. Launch a formal outreach and training program for potential contributors and users, including city planners, community boards, and engineering students.
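The "querying" capability mentioned above often starts as something very simple: a bounding-box lookup over the aggregated records. The sketch below simulates that query in memory (the records and field names are invented; a real platform would serve the same operation over an HTTP API):

```python
# Illustrative records; a real commons would hold thousands.
RECORDS = [
    {"borehole_id": "BH-1", "lat": 47.601, "lon": -122.334, "soil_class": "CL"},
    {"borehole_id": "BH-2", "lat": 47.610, "lon": -122.320, "soil_class": "SM"},
    {"borehole_id": "BH-3", "lat": 47.700, "lon": -122.400, "soil_class": "GW"},
]

def query_bbox(records, min_lat, max_lat, min_lon, max_lon):
    """Return records whose coordinates fall inside the bounding box."""
    return [
        r for r in records
        if min_lat <= r["lat"] <= max_lat and min_lon <= r["lon"] <= max_lon
    ]

# A planner asks: what do we already know near this site?
nearby = query_bbox(RECORDS, 47.595, 47.615, -122.340, -122.315)
print([r["borehole_id"] for r in nearby])  # ['BH-1', 'BH-2']
```

Even this trivial operation delivers the reciprocal-value incentive discussed earlier: before drilling anything, a contributor can see every nearby record the commons already holds.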

Step 7: Iterate, Refine, and Institutionalize

A commons is never "finished." Regularly convene your user community to review the data schema, governance rules, and platform features. Add new data types based on demand (e.g., thermal conductivity for geothermal, contamination data). Work towards formal adoption of the commons into official city planning and capital project workflows, ensuring its long-term sustainability as a critical piece of public infrastructure.

Real-World Scenarios: The Commons in Action

To move from theory to practice, let's examine two composite, anonymized scenarios that illustrate the transformative potential of a subsurface data commons. These are based on common patterns observed in urban development and resilience planning.

Scenario A: Equitable Climate Adaptation in a Coastal Neighborhood

A historically underserved coastal neighborhood, "Marina View," is experiencing increased tidal flooding and sewer backups during storms. The city has limited funds for adaptation. Traditionally, without detailed, neighborhood-scale subsurface data, the city might default to a visible but potentially superficial solution like raising street curbs. However, this city has a nascent subsurface commons. Planners query the database, integrating decades of scattered borehole data from small housing repairs, utility work, and a few past environmental studies. The synthesized 3D model reveals a critical pattern: a layer of low-permeability clay sits beneath much of the neighborhood, preventing stormwater from infiltrating and causing it to pond. It also shows areas where the clay layer is absent, connecting to a deeper, permeable aquifer. With this knowledge, the city and community co-design a targeted solution: a distributed network of infiltration basins and "sump" gardens located precisely where the geology allows, combined with a focused program of green infrastructure grants for residents in the clay-heavy zones. The commons data enabled a more effective, less costly, and equitable solution that addressed the root cause, not just the symptom, and directed resources based on scientific need rather than political influence.

Scenario B: Accelerating Sustainable Redevelopment

A mid-sized city is redeveloping a former industrial "brownfield" site into a mixed-use, sustainable district with ambitions for district-scale geothermal heating and cooling. Under the old model, each parcel developer would need to conduct their own costly and time-consuming thermal response test (TRT), a process that can take weeks and cost tens of thousands of dollars. With a well-established subsurface commons that includes geothermal parameters, the city master developer contributes a single, comprehensive TRT and a dense grid of borehole thermal conductivity measurements at the project outset. This data is uploaded to the commons under a license that allows use by any project within the district boundary. Now, every individual building developer can design their geothermal borefield with high confidence using the shared data, eliminating redundant testing, cutting months off project timelines, and reducing upfront costs. This makes sustainable technology more financially viable, accelerates the overall decarbonization of the district, and provides a powerful market incentive for future developers to both use and contribute to the commons.

Scenario C: Crisis Response and Recovery

Following a significant earthquake, rapid damage assessment is critical. Emergency responders need to know which areas are most likely to have suffered liquefaction (where soil loses strength) to prioritize search and rescue and inspect critical infrastructure. A city with a comprehensive subsurface commons can immediately generate and share liquefaction susceptibility maps based on the integrated soil type and groundwater data. This allows for a scientifically informed response, potentially saving lives and focusing resources. Post-disaster, the commons becomes invaluable for reconstruction, providing every engineering team working on repairs with immediate context, preventing design conflicts, and ensuring rebuilt infrastructure is resilient to the actual ground conditions.
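To make the mapping idea concrete, here is a deliberately simplified triage sketch showing how integrated commons data (soil class, groundwater depth, SPT N-value) could be turned into a first-pass susceptibility flag. The thresholds and soil set are illustrative assumptions only, not a recognized engineering screening procedure, and no such flag substitutes for a proper liquefaction analysis:

```python
# Toy heuristic: loose, saturated, sandy/silty ground is the classic
# liquefaction-prone combination. Thresholds below are invented for
# illustration and are NOT an engineering standard.
SUSCEPTIBLE_SOILS = {"SM", "SP", "ML"}

def susceptibility_flag(soil_class: str, gw_depth_m: float, spt_n: int) -> str:
    """Rough triage: 'high' requires loose, shallow-saturated, susceptible soil."""
    if soil_class in SUSCEPTIBLE_SOILS and gw_depth_m < 3.0 and spt_n < 15:
        return "high"
    if soil_class in SUSCEPTIBLE_SOILS and (gw_depth_m < 6.0 or spt_n < 25):
        return "moderate"
    return "low"

print(susceptibility_flag("SM", 1.5, 8))   # loose saturated sand -> high
print(susceptibility_flag("CL", 1.5, 8))   # clay -> low
```

The point of the sketch is not the heuristic itself but the workflow: because the commons already holds the three inputs for every mapped location, a city can generate a screening map in minutes rather than weeks after an event.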

Common Questions and Addressing Concerns

Transitioning to a commons model raises legitimate questions from various stakeholders. Addressing these concerns head-on is essential for building broad-based support.

Won't this expose proprietary insights and hurt my competitive advantage?

This is the most common concern from private engineering firms. The key is differentiating between raw, factual site data and the interpretive engineering analysis, design, and recommendations that constitute a firm's intellectual property. A commons should only mandate the sharing of the former—the standardized measurements (soil types, depths, test results). The value of a top-tier firm lies in its expert judgment, modeling, and innovative design solutions applied to that data. In fact, access to a richer regional dataset enhances the quality of that expert analysis, allowing firms to deliver better, lower-risk designs to their clients. The competitive landscape shifts from hoarding data to competing on the quality of interpretation and engineering innovation.

What about liability? If someone uses "my" data and something fails, am I responsible?

Liability is a serious concern that must be addressed through clear legal frameworks. A well-designed commons includes robust disclaimers and data licenses that explicitly state the data is provided "as-is" for informational and planning purposes only, and that it does not constitute a substitute for site-specific investigation and professional design. Contributors are not warranting the data for use in final design. The model is analogous to public weather data: a city provides rainfall statistics, but a civil engineer is still responsible for designing a culvert to handle that rainfall. Legal protocols should be reviewed by qualified counsel to ensure they provide appropriate protection for contributors while maintaining the utility of the data.

How do we ensure data quality and prevent "garbage in, garbage out"?

Governance rules must include quality assurance/quality control (QA/QC) protocols. These can be tiered: Level 1 - automated validation (e.g., checking that coordinates are within the city bounds, depths are logical). Level 2 - peer or curator review, where submissions from new contributors are spot-checked by a trusted expert or committee. Level 3 - user feedback and flagging systems, allowing users to report potential errors. Additionally, the platform can display metadata about each data point: who collected it, when, using what standard, and its QA/QC status. This allows users to apply their own judgment, much like a researcher evaluates the provenance of any source.
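Level 1 of the tiered QA/QC above is straightforward to automate. A minimal sketch, in which the city bounds, field names, and sanity limits are all illustrative assumptions:

```python
# Illustrative bounding box for "within the city"; a real deployment
# would use the jurisdiction's actual boundary geometry.
CITY_BOUNDS = {"min_lat": 47.48, "max_lat": 47.74,
               "min_lon": -122.46, "max_lon": -122.22}

def validate_record(rec: dict) -> list[str]:
    """Level 1 automated checks; returns a list of errors (empty = pass)."""
    errors = []
    if not (CITY_BOUNDS["min_lat"] <= rec.get("latitude", -999) <= CITY_BOUNDS["max_lat"]):
        errors.append("latitude outside city bounds")
    if not (CITY_BOUNDS["min_lon"] <= rec.get("longitude", -999) <= CITY_BOUNDS["max_lon"]):
        errors.append("longitude outside city bounds")
    depth = rec.get("depth_m")
    if depth is None or depth < 0 or depth > 1000:   # sanity limit, not a standard
        errors.append("depth missing or implausible")
    n = rec.get("spt_n_value")
    if n is not None and not (0 <= n <= 100):        # illustrative plausibility cap
        errors.append("SPT N-value out of plausible range")
    return errors

good = {"latitude": 47.60, "longitude": -122.33, "depth_m": 12.5, "spt_n_value": 18}
bad = {"latitude": 12.0, "longitude": -122.33, "depth_m": -3}
print(validate_record(good))  # []
print(validate_record(bad))
```

Records that fail Level 1 can be rejected automatically at submission time; records that pass move on to the human review and user-flagging tiers described above.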

Who pays for building and maintaining the platform?

Sustainable funding is critical. A mixed-revenue model is often most robust. Potential sources include: public core funding (viewing it as essential public infrastructure like a base map), membership fees from large institutional users (utilities, universities), modest service fees for advanced API access or large-volume commercial users, and grants from foundations focused on climate resilience or open data. The goal is to avoid relying on a single source that could disappear, ensuring the commons' operational independence and longevity.

Conclusion: Building a Resilient Foundation for All

The challenge of urban resilience in the 21st century demands that we see our cities not just as collections of buildings above ground, but as integrated systems rooted in a complex geological foundation. Treating the subsurface as a commons, and its data as a public good, is a paradigm shift with profound implications for equity, sustainability, and long-term fiscal responsibility. It moves us from a fragmented, reactive, and wasteful approach to a collaborative, proactive, and efficient one. By pooling our knowledge of the ground beneath us, we can make smarter investments, protect the most vulnerable communities from hidden risks, and unlock innovative, low-carbon solutions like geothermal energy. The path requires careful navigation of technical, legal, and social hurdles, but the frameworks and steps outlined here provide a starting point. The subsurface belongs to no one and yet is foundational to everyone. It's time our management of its data reflected that reality, building a truly resilient foundation for generations to come.

About the Author

This article was prepared by the editorial team for this publication. We focus on practical explanations and update articles when major practices change.

Last reviewed: April 2026
