Real problems worth solving

Browse frustrations, pains, and gaps that founders could tackle.

Independent grocery stores operate on net margins of 1-2%. On a $100 grocery transaction, the store keeps $1-$2 in profit. But credit card interchange fees charged by Visa and Mastercard average 1.5-2.5% of the transaction, meaning the card networks collect $1.50-$2.50 on that same sale. In many transactions, Visa and Mastercard literally make more money than the grocery store does.

This matters because grocery stores cannot refuse credit cards without losing customers to competitors who accept them. Unlike restaurants or small retailers who can add surcharges, grocery stores operate in a hyper-competitive market where any friction at checkout drives shoppers to the Walmart or Kroger down the street. The result is that grocers are trapped: they must accept cards to stay competitive, but every card swipe erodes the razor-thin margin that keeps their lights on. For independent grocers, swipe fees are now their second-highest operating cost after labor, and unlike labor, they have zero ability to negotiate the rate downward.

This problem persists because Visa and Mastercard operate a duopoly that controls over 80% of card network transactions. Merchants have no real alternative payment rail with comparable consumer adoption. Legislative efforts like the Credit Card Competition Act have stalled repeatedly due to bank lobbying. The fundamental structural issue is that interchange fees are set by the card networks, not negotiated in a competitive market, so grocers are price-takers in a system designed to extract maximum rent from every transaction. The Visa-Mastercard settlement proposed in 2024 would only reduce fees by 0.1% over five years, which is negligible against margins this thin.
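A minimal sketch of the checkout arithmetic, using the midpoints of the ranges quoted above (1.5% net margin, 2.0% interchange); the rates are illustrative assumptions, not actual fee schedules:

```python
# Who earns more on a $100 grocery basket? Rates are the midpoints of the
# ranges quoted in the text -- illustrative assumptions, not fee schedules.

def checkout_split(basket: float, net_margin: float, interchange: float) -> tuple[float, float]:
    """Return (store profit, card-network take) on one transaction."""
    return basket * net_margin, basket * interchange

profit, fee = checkout_split(100, net_margin=0.015, interchange=0.020)
print(f"Store keeps ${profit:.2f}; card networks collect ${fee:.2f}")
# -> Store keeps $1.50; card networks collect $2.00
```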

finance

ASCE's 2025 Infrastructure Report Card projects a $3.7 trillion gap between current planned infrastructure investment and what is needed to bring U.S. infrastructure to good working order — a 43% increase from the $2.59 trillion gap reported in 2021, despite the passage of the $1.2 trillion Infrastructure Investment and Jobs Act in between. The overall infrastructure grade improved marginally from C- to C, but nine of the graded categories still received D-range grades. Most notably, energy infrastructure was downgraded from C- to D+, driven by the collision between exponentially growing electricity demand (from EVs, data centers, AI, and electrification) and a grid where 30-46% of transmission and distribution assets are beyond their useful life, according to Bank of America analysis.

This gap is not an abstract number — it translates directly into the daily experience of Americans. It means bridges with weight restrictions that force trucks onto longer routes, increasing shipping costs that flow into consumer prices. It means water main breaks that shut off service to neighborhoods. It means dam failures during floods. It means transformer shortages that cause multi-day blackouts after storms. It means sewage in rivers. Each category of failing infrastructure imposes costs on different populations, but cumulatively, ASCE estimates that infrastructure deficiencies cost each American household $3,300 per year in lost time, higher prices, and reduced quality of life.

The gap grows despite increased spending because of three compounding forces. First, construction cost inflation has accelerated — labor shortages, material costs, and supply chain disruptions mean every infrastructure dollar buys less than it did four years ago. Second, climate change is accelerating the deterioration of existing infrastructure through more extreme heat, flooding, and freeze-thaw cycles, meaning maintenance costs are rising faster than budgets. Third, new demands like EV charging, grid-scale battery storage, and data center power are creating infrastructure needs that did not exist a decade ago, layering new requirements on top of the existing repair backlog. The Bipartisan Infrastructure Law was the largest infrastructure investment in decades, but it was designed to address a $2.59 trillion gap — and by the time the money flows, the gap has grown to $3.7 trillion. The funding is running behind the problem.

construction

The United States has over 100,000 railroad bridges, many built in the early 20th century, carrying freight and passenger trains that are heavier than anything their designers anticipated. As of 2025, the Federal Railroad Administration has exactly seven employees trained to assess bridge safety. Seven people for 100,000+ bridges. The Department of Transportation announced a plan to train 163 track inspectors to spot critical bridge problems, but this is an expansion of the track inspector role — not the creation of dedicated bridge engineers. The gap between the number of structures requiring expert assessment and the workforce available to perform it is not a shortfall; it is a chasm.

The consequences of this inspection deficit are measured in derailments and near-misses. When a rail bridge fails, the result can be a train derailment carrying hazardous materials through populated areas — a scenario whose stakes the nation saw in East Palestine, Ohio in 2023. Rail bridges are not subject to the same federal inspection standards as highway bridges; railroad companies largely self-inspect using their own employees and standards. There is no equivalent to the National Bridge Inventory for rail bridges, meaning the federal government does not have a comprehensive, standardized database of rail bridge conditions. Without consistent, independent inspection data, there is no way to systematically prioritize which bridges are most dangerous or to hold railroad operators accountable for deferred maintenance.

This problem persists because of a regulatory framework that treats railroads as private infrastructure owners responsible for their own safety, dating back to an era when railroad companies were the most powerful and well-capitalized entities in America. Today, Class I railroads have focused on cost-cutting through Precision Scheduled Railroading, reducing maintenance-of-way crews and deferring capital expenditures on bridge rehabilitation to boost quarterly earnings. Federal regulation has not kept pace: the FRA's bridge safety standards (49 CFR Part 237) require railroads to have bridge management programs but place minimal requirements on inspection frequency, inspector qualifications, or public reporting of findings. The result is a system where the entities responsible for bridge safety have financial incentives to minimize maintenance spending, and the regulator lacks the staff to verify compliance.

construction

In the first two months of 2026, metro Detroit communities were hammered by an extraordinary surge of water main breaks. Harper Woods alone experienced 14 breaks — more than a third of the 38 it had in all of 2025. Roseville dealt with 40 breaks, and Eastpointe recorded 25. These are not large cities; they are inner-ring suburbs with limited public works budgets and pipe networks installed in the 1920s through 1950s. Each break means residents lose water service, roads are torn up for emergency repairs, and thousands of gallons of treated water are wasted. Nationally, the American Society of Civil Engineers estimates 700-850 water main breaks occur every day in North America, costing over $3 billion annually in repairs.

The human impact goes beyond inconvenience. When water mains break, water pressure drops throughout the system, which can allow contaminants to enter through cracks and joints — creating boil-water advisories that affect hospitals, schools, restaurants, and homes. Businesses that depend on water — laundromats, restaurants, car washes, manufacturers — lose revenue for every hour of outage. Emergency repairs are far more expensive than planned replacements, consuming budgets that could otherwise fund proactive infrastructure investment. The U.S. loses nearly 20% of its treated drinking water — 2 trillion gallons per year — to leaks in aging distribution systems, costing utilities and customers $6.4 billion annually. That is treated, drinkable water being pumped, chemically processed, and then lost into the ground.

The structural cause is straightforward but politically intractable: replacing water mains costs $1-2 million per mile, and most American cities have hundreds to thousands of miles of pipe that need replacement simultaneously. The replacement rate nationwide is roughly 0.5% of pipe per year, meaning the full system would take 200 years to replace at current pace. Municipal water rates would need to increase dramatically to fund accelerated replacement, but rate increases are deeply unpopular and disproportionately burden low-income residents. Federal infrastructure funding helps but is a fraction of the estimated $625 billion needed for drinking water infrastructure over 20 years. The result is a strategy of 'run to failure' — operating pipes until they break, then making emergency repairs — which is the most expensive possible approach but the only one that fits within annual operating budgets.
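The replacement-pace arithmetic reduces to a few lines. The sketch below assumes a hypothetical 1,000-mile distribution network (an invented figure for scale) and the midpoint of the quoted $1-2 million per-mile cost:

```python
# Back-of-envelope on the 'run to failure' math. The pipe mileage is a
# hypothetical mid-size system; the rate and cost ranges come from the text.

pipe_miles = 1_000        # hypothetical city-wide network (assumption)
replacement_rate = 0.005  # ~0.5% of pipe replaced per year (from the text)
cost_per_mile = 1.5e6     # midpoint of the quoted $1-2M per mile

years_to_turn_over = 1 / replacement_rate                     # 200 years
annual_spend = pipe_miles * replacement_rate * cost_per_mile  # budget needed

print(f"Full-system turnover at current pace: {years_to_turn_over:.0f} years")
print(f"Annual replacement spend for {pipe_miles:,} miles: ${annual_spend / 1e6:.1f}M")
# -> 200 years, $7.5M/year -- against pipes that last perhaps 75-100 years
```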

construction

Northern Virginia — particularly Loudoun County, which hosts the densest concentration of data centers on Earth — has hit a hard physical constraint: the transmission and distribution grid cannot deliver enough electricity to meet demand. Dominion Energy has imposed interconnection delays of four to seven years for new data center projects seeking grid connections, up from 18-24 months just five years ago. The bottleneck is not generation capacity but transmission infrastructure — aging high-voltage lines and substations that were built decades ago for a suburban residential load profile, not for facilities that each consume as much electricity as a small city. In July 2024, a 230 kV transmission line fault in Northern Virginia dropped 1,500 MW of data center load across 60 facilities and 25 substations.

The economic stakes are enormous. Virginia's data centers represent a multi-billion-dollar industry, and the companies behind them (Amazon, Microsoft, Google, Meta) are threatening to build elsewhere if power cannot be delivered. But the grid constraint also affects ordinary residents: the 833% increase in 2024 PJM capacity auction prices for the 2025-2026 market — driven largely by data center demand — translates directly into higher electricity bills for every Virginia ratepayer. The U.S. Department of Energy has warned that without grid upgrades, Northern Virginia could face 400+ hours of power outages per year by 2030. This means the aging grid is simultaneously throttling economic development and degrading reliability for existing customers.

The problem persists because transmission infrastructure takes 7-10 years to plan, permit, and build, while data center demand has been doubling every 2-3 years. The grid was designed for slow, predictable load growth — not exponential surges driven by AI training workloads. Dominion's rate case now proposes a new electricity rate class for data centers, requiring them to pay for 85% of contracted transmission capacity, but this addresses cost allocation, not the physical constraint of insufficient wires and transformers. Building new transmission lines requires rights-of-way through some of the most expensive real estate in America, environmental reviews, and coordination across multiple jurisdictions. Meanwhile, 67% of utility spending on transmission and distribution in 2024 — $63 billion — went to replacements and upgrades of existing equipment rather than new capacity, illustrating how much of the budget is consumed just keeping the existing aging grid functional.
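To see the timing mismatch concretely, here is a small sketch that assumes clean exponential load growth at the doubling rates quoted above (real load curves are messier, so this is an upper-bound illustration, not a forecast):

```python
# How much does demand grow while one transmission project is built?
# Assumes clean exponential growth at the doubling rates quoted in the text.

def growth_over_build(doubling_years: float, build_years: float) -> float:
    """Demand multiple accumulated over one project's build time."""
    return 2 ** (build_years / doubling_years)

for doubling in (2, 3):
    for build in (7, 10):
        print(f"doubling every {doubling}y, {build}y build: "
              f"demand x{growth_over_build(doubling, build):.1f} at energization")
# Even the best case (3-year doubling, 7-year build) sees demand grow ~5x
# before a line planned today carries its first electron.
```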

construction

On January 28, 2022, the Fern Hollow Bridge on Forbes Avenue in Pittsburgh collapsed, sending a transit bus, several cars, and their occupants plunging into the ravine below. Ten people were injured. The NTSB investigation, concluded in February 2024, determined the collapse was caused by corrosion and section loss in a transverse tie plate on the bridge's southwest leg — damage that had been documented in inspection reports for years. The NTSB found that the City of Pittsburgh 'failed to act on repeated maintenance and repair recommendations from inspection reports.' Additionally, PennDOT contractors conducting inspections on behalf of the city failed to identify fracture-critical areas and did not calculate load ratings accurately.

The collapse happened the morning President Biden was visiting Pittsburgh to promote his infrastructure bill — a coincidence that turned a local bridge failure into a national symbol of infrastructure neglect. But the real significance is what the NTSB findings reveal about systemic inspection failures. The bridge was inspected regularly on paper, but the inspections were not compliant with federal guidance, missed critical structural elements, and the recommendations that were made were simply ignored by the city. This means the nation's bridge inspection regime — the primary safety mechanism preventing bridge collapses — can produce a false sense of security. A bridge can be 'inspected' and still collapse because the inspection was inadequate or the findings were never acted upon.

The structural reason this happens is a disconnect between the entities that inspect, the entities that fund repairs, and the entities that prioritize spending. Pittsburgh's bridge inventory is massive — the city has more bridges than any other U.S. city — and its maintenance budget is chronically insufficient. Inspection recommendations compete with every other municipal spending priority (police, fire, schools, potholes), and there is no federal mechanism to force a city to act on inspection findings. The $25.3 million federal commitment to replace Fern Hollow was reactive, not proactive. Nationally, over 46,000 bridges are structurally deficient, and 63,085 are posted for load restrictions. The average bridge age is 47 years, approaching the 50-year design life. The IIJA allocated $40 billion for bridges, but ASCE warns that without life-cycle maintenance planning, today's 'fair' bridges will slide into 'poor' condition.

construction

On January 19, 2026, a 72-inch diameter section of the Potomac Interceptor — a 54-mile sewer trunk line built in the 1960s — collapsed beneath Clara Barton Parkway in Montgomery County, Maryland. Over the following three weeks, more than 244 million gallons of raw, untreated wastewater poured into the Potomac River and the C&O Canal National Historical Park, making it one of the largest sewage spills in U.S. history. Large boulders had nearly completely blocked the pipe, and investigators believe the blockage traces back to the original 1960s construction, when builders placed large rocks too close to the pipe, allowing them to eventually fall in. E. coli levels in the Potomac spiked to hundreds of times the EPA's safe limit.

While drinking water intakes were upstream of the spill, the contamination closed recreational access to miles of the river during a period when communities rely on it. The political fallout was immediate — Virginia, Maryland, and DC officials clashed over responsibility and cost-sharing for a pipe that crosses jurisdictional boundaries. DC Water, which operates the interceptor, scrambled to build a bypass using the C&O Canal itself to reroute wastewater, containing the overflow within 21 days. But subsequent inspections revealed two more sections of the Potomac Interceptor rated at high risk of similar failure.

This incident exposes a systemic problem: America's major sewer trunk lines were built 50-70 years ago, and most have never been comprehensively inspected using modern technology because they are too large, too deep, and carry too much flow to easily take offline. The Potomac Interceptor serves the entire DC metropolitan area's wastewater needs, making it a single point of failure with no redundancy. The pipe's failure mode — construction defects from the 1960s only now manifesting — illustrates how aging infrastructure can harbor invisible time bombs. There is no national inventory of trunk sewer condition, and utilities rarely have the budget or political will to proactively replace lines that appear to be functioning. The EPA estimates $630 billion in wastewater infrastructure needs over the next 20 years, but the U.S. water utility sector faced an estimated $110 billion funding gap in 2024 alone.

construction

The United States has over 16,700 dams classified as 'high hazard potential,' meaning their failure would cause loss of life and significant property destruction. Of these, approximately 2,500 — roughly 15% — are assessed as being in poor or unsatisfactory condition, according to ASCE's 2025 Infrastructure Report Card (which gave dams a D+ grade). The nation's dams average 64 years old, and by 2025, seven in ten are over 50 years old — past the design life they were engineered for. Meanwhile, the average state dam safety official is responsible for overseeing 190 dams, making thorough, regular inspections physically impossible.

The consequences of this neglect are not hypothetical. In June 2024, Minnesota's 114-year-old Rapidan Dam partially failed after three days of heavy rain, washing away two electrical substations, destroying homes, and requiring emergency evacuations. The dam had been rated 'poor' condition since at least April 2023, and the county faced an impossible financial choice: spend $15 million on repairs with little return, or $82 million to remove it entirely. Vermont's dams average 89 years old, and after July 2023 flooding, inspectors found 57 dams had been overtopped and 50 sustained notable damage. These are not outliers — they are previews of what will happen as aging dams face increasingly extreme weather events.

The structural reason this crisis deepens is a catastrophic funding mismatch. ASCE estimates $185 billion is needed between 2024 and 2033 to maintain and upgrade dam infrastructure. But the federal High Hazard Potential Dam Rehabilitation Grant Program, authorized at $60 million per year, received zero federal funding in both fiscal years 2023 and 2024. For FY2025, only $7.48 million was made available for state dam safety programs — a rounding error against the actual need. Most of these dams are privately owned and the owners lack the financial resources for major rehabilitation. States lack the inspectors to even identify which dams are most dangerous, let alone fund repairs. The result is a slow-motion disaster where we know thousands of dangerous dams exist but systematically underfund every mechanism designed to address them.
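The scale of the mismatch is easiest to see as a ratio. The sketch below spreads the decade-long estimate evenly across ten years, which is a simplifying assumption made only to put the two figures on the same footing:

```python
# FY2025 dam-safety funding vs. the annualized need. Spreading the decade
# estimate evenly across ten years is an assumption made for scale only.

need_2024_2033 = 185e9             # ASCE estimate quoted above
annual_need = need_2024_2033 / 10  # ~$18.5B per year, assumed uniform
fy2025_funding = 7.48e6            # state dam safety funding quoted above

print(f"FY2025 funding covers {fy2025_funding / annual_need:.3%} of one year's need")
# -> 0.040% -- four hundredths of one percent
```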

construction

Philadelphia's sewer system — like those in roughly 700 other U.S. municipalities — combines stormwater runoff and human sewage into the same pipes. When it rains more than moderately, the combined volume overwhelms treatment plant capacity, and raw sewage mixed with stormwater is discharged directly into the Delaware and Schuylkill rivers. According to a 2024 PennEnvironment Research & Policy Center report, Philadelphia's waterways received an average of 12.7 billion gallons of combined sewer overflows per year from fiscal years 2016 to 2024. A single overflow point (T14) discharged nearly 2 billion gallons of raw sewage between July 2023 and June 2024 alone. The city's sewers overflow into local waterways on 65 or more days in an average year.

This is not an abstract environmental statistic. It means Philadelphia's rivers — used for recreation, fishing, and as drinking water sources — are potentially too polluted for human contact for up to 195 days per year. Residents in neighborhoods near overflow points experience sewage backups into their basements during heavy rains. The health risks include exposure to E. coli, hepatitis A, giardia, and other waterborne pathogens. Property values near overflow-affected waterways are depressed. The city has spent billions on its 'Green City, Clean Waters' plan to reduce overflows through green infrastructure (rain gardens, permeable pavement), but even at full implementation, CSOs will not be eliminated — only reduced by 85%.

The problem persists because rebuilding a combined sewer system into separate storm and sanitary systems is astronomically expensive — estimated at $10-20 billion for Philadelphia alone — and would require tearing up nearly every street in the city. The infrastructure was designed in the 19th century when the engineering assumption was that dilution solved pollution. Federal Clean Water Act enforcement has been inconsistent, and the EPA's own 2004 estimate of 850 billion gallons of annual CSO discharges nationwide has not been updated in two decades, meaning regulators are operating with outdated data. Climate change is making the problem worse: heavier, more frequent rainstorms trigger more overflows from systems already at capacity.

construction

More than 40 million distribution transformers in the United States — the gray canisters on wooden poles that step voltage down to serve homes and businesses — have already exceeded their designed service life. When one of these units fails, the replacement lead time has ballooned from a few weeks pre-pandemic to 120 weeks for standard units and up to 210 weeks (four years) for large power transformers, according to the North American Electric Reliability Corporation. Wood Mackenzie projects a 30% supply deficit for power transformers and 10% for distribution transformers in 2025. Prices have surged since 2022, with unit costs up 77% for power transformers and up to 95% for some distribution transformer classes.

The downstream consequences are immediate and severe. When a transformer fails during a heat wave or winter storm and there is no replacement available, entire neighborhoods lose power for days or weeks instead of hours. Utilities are now hoarding transformers — buying units they don't yet need and storing them, which further tightens supply for utilities that lack the capital to stockpile. Rural electric cooperatives and small municipal utilities are hit hardest because they cannot compete with large investor-owned utilities for limited manufacturing slots. The result is a two-tier reliability system where wealthy utilities maintain service and poorer ones watch their grids degrade.

The structural root cause is a decades-long hollowing out of domestic transformer manufacturing capacity. The U.S. has only one domestic supplier of grain-oriented electrical steel (GOES) — Cleveland-Cliffs, with plants in Pennsylvania and Ohio. Imports now account for 80% of power transformer supply and 50% of distribution transformer supply. Transformer manufacturing requires highly specialized labor that takes years to train, and OEMs cite labor shortages as a key reason they cannot scale output. Meanwhile, demand has surged 116% for power transformers and 41% for distribution transformers since 2019, driven by data centers, EV charging, and electrification. OEMs have announced $1.8 billion in capacity expansions since 2023, but new capacity takes 3-5 years to come online — meaning the shortage will persist well into the 2030s.

construction

Chicago has more lead service lines than any other American city — over 412,000 — and is legally required by the EPA's 2024 Lead and Copper Rule to replace all of them. But the city's own replacement plan, submitted to the Illinois EPA in April 2025, targets just 8,300 replacements per year, which means finishing in 2076 — three decades past the federal deadline of mid-2049. At roughly $35,000 per line, the total estimated bill exceeds $12 billion. So far, the city has replaced about 14,000 lines over five years at a cost of more than $400 million, and has drawn only $70-90 million of an approximately $325 million federal loan that expires in 2026.

Why does this matter? Because every day those pipes remain in the ground, hundreds of thousands of Chicago households are drinking water that has passed through lead. Lead exposure causes irreversible neurological damage in children, lowers IQ, and increases behavioral problems. For adults it raises blood pressure and kidney damage risk. This is not a theoretical hazard — it is a daily, ongoing poisoning of a major American city's residents. The economic consequences compound: lower educational attainment, higher healthcare costs, reduced lifetime earnings. Studies estimate that lead exposure costs the U.S. economy $80 billion annually in lost productivity.

This problem persists structurally because of a vicious funding circle. The city cannot afford to replace pipes faster without massive outside funding, but federal programs are being cut — Congress is considering slashing $125 million from lead pipe replacement funding. Meanwhile, the city's water rates would need to increase dramatically to self-fund, which is politically toxic in a city where residents already distrust utility billing. The pipes were installed over more than a century during which Chicago actually mandated lead service lines by city code (a requirement on the books until 1986), creating a uniquely concentrated legacy problem that no incremental annual budget can realistically address at the required pace.
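The pace-versus-deadline math above reduces to a few lines. The 2026 start year in the sketch is an assumption based on the plan's April 2025 submission; the per-line cost is the rough $35,000 figure quoted in the text:

```python
# Replacement timeline implied by the figures above. The 2026 start year
# is an assumption; the per-line cost is the rough figure from the text.

total_lines = 412_000
per_year = 8_300
cost_per_line = 35_000

years_needed = total_lines / per_year        # ~50 years of work
finish_year = 2026 + round(years_needed)     # ~2076 vs. a mid-2049 deadline
total_cost = total_lines * cost_per_line     # ~$14.4B, i.e. "exceeds $12B"

print(f"{years_needed:.0f} years -> finish ~{finish_year}; "
      f"total bill ~${total_cost / 1e9:.1f}B")
```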

construction

Even when clients still want human-created art, AI has destroyed pricing power for freelance illustrators. Clients now arrive at negotiations having already generated 'good enough' AI images for free, and use those as a baseline: 'Why should I pay you $3,000 for a book cover when Midjourney gave me something decent for $10?' The 2025 Boekmanstichting survey of 700+ creatives found that one in five freelance artists has lost income, with commercial assignments — the most lucrative category — hit hardest. The National Endowment for the Arts found that 61% of working artists worry AI will devalue their labor within five years, up from 38% in 2023.

The deeper damage is not just to current income but to the entire career pipeline. When entry-level commercial work (product illustrations, social media graphics, marketing assets) gets automated, junior artists lose the stepping stones they need to develop into senior illustrators. A concept artist at a game studio told Blood in the Machine that their boss used AI to generate textures that would have required hiring another artist. The History Channel is airing seasons of 'Life After People' that heavily feature AI-generated visuals. Each of these substitutions eliminates a job that would have trained the next generation of artists. The pipeline of skilled illustrators is being cut at the entry level, which will create a talent crisis in 5-10 years even for work that AI cannot do.

This problem persists because the market for illustration has always been fragmented and informal. There is no guild with binding rate minimums, no standardized contracts, and no collective bargaining mechanism. Each freelancer negotiates individually against clients who can now credibly threaten to use AI instead. Unlike actors (SAG-AFTRA) or screenwriters (WGA), who won AI protections in their 2023 contracts through organized strikes, visual artists have no equivalent union with the leverage to negotiate industry-wide protections. The few protections that exist — like the EU's AI Act requiring disclosure of training data — are geographically limited and have no enforcement mechanism that individual artists can access.

legal

By early 2026, the AI licensing landscape for news publishers has split into haves and have-nots. Large publishers like News Corp ($50 million/year from OpenAI), the New York Times (deal with Amazon), and the Associated Press (deal with Google) have signed multi-year licensing agreements. But small and mid-size publishers face an impossible choice: sign a licensing deal for a fraction of their content's value, or refuse and watch their search traffic evaporate as AI-generated summaries replace click-throughs.

Google's AI Overviews present the most acute version of this problem. When Google uses a publisher's content to generate an AI Overview that answers a user's question directly on the search results page, the user never clicks through to the publisher's site. Publishers who opted out of Google's AI training crawlers discovered that Google still uses their content for AI Overviews through its regular search index — there is no way to opt out of AI Overviews without opting out of Google Search entirely. For small publishers who depend on search traffic for 40-70% of their audience, this is not a choice at all. The Digiday publisher scorecard showed that most publishers rated Big Tech's licensing deals poorly, viewing them as inadequate compensation for the value being extracted.

This problem persists because of an extreme power asymmetry. There are a handful of AI companies (Google, OpenAI, Meta, Amazon) and thousands of publishers. Each individual publisher's content is dispensable to an AI model trained on billions of pages, but the AI platform's search traffic is indispensable to the publisher. This gives AI companies overwhelming negotiating leverage. The RSL Collective and IAB Tech Lab's CoMP framework are attempting to standardize licensing terms, but standardization only helps if publishers have the collective bargaining power to enforce those terms. The New York Times and Chicago Tribune sued Perplexity in December 2025 for copyright infringement, but most small publishers cannot afford litigation. The result is a two-tier internet: large publishers get paid (poorly), small publishers get scraped for free, and the journalism that communities depend on becomes economically unviable.

legal

In the first quarter of 2025 alone, celebrities were targeted in deepfake scam advertisements 47 times — an 81% increase compared to all of 2024. Taylor Swift, Scarlett Johansson, and Brad Pitt are among the most impersonated. A French woman lost $850,000 to a scammer using AI-generated images of Brad Pitt over an 18-month romance scam. Brazilian authorities arrested suspects in a scheme using deepfake videos of Gisele Bundchen in Instagram ads, with over 20 million reais in suspicious funds. McAfee found that 72% of Americans have seen fake celebrity endorsements; 10% lost money, averaging $525 per victim.

The human cost goes beyond celebrities. When a deepfake ad uses a doctor's likeness to sell fake supplements, as happened in a case covered by NBC's Today show in 2025, real patients make health decisions based on fraudulent endorsements. When scam crypto ads use deepfakes of trusted financial figures, real people lose their savings. The celebrities whose likenesses are stolen suffer reputational damage and spend enormous resources on takedowns that are ineffective — for every ad removed, ten more appear. The UK's Advertising Standards Authority issued 10 Scam Ad Alerts specifically for deepfake video ads in 2025, all for cryptocurrency scams.

This problem persists because of a fundamental gap between content generation speed and content moderation capacity. A scammer can generate a convincing deepfake ad in minutes using freely available tools. Platforms rely on user reporting and automated detection, both of which lag far behind generation capabilities. Section 230 of the Communications Decency Act shields platforms from liability for user-generated content, removing the economic incentive to invest heavily in detection. The TAKE IT DOWN Act (signed May 2025) and the DEFIANCE Act (passed January 2026) address nonconsensual intimate deepfakes specifically, but neither covers commercial deepfake scam ads. Victims must pursue individual civil claims against anonymous, often overseas scammers — a functionally impossible task.

legal

In November 2022, open-source developers filed a class-action lawsuit against GitHub, Microsoft, and OpenAI, alleging that Copilot was trained on code from public repositories licensed under open-source licenses like GPL, MIT, and Apache — licenses that explicitly require attribution, license notices, and in the case of GPL, copyleft obligations. Copilot generates code that sometimes reproduces licensed code verbatim, without any attribution or license notice. The developers argued this violates both copyright law and the specific terms of the open-source licenses their code was released under. The federal court dismissed the copyright infringement claims in 2023 for lack of specific examples of copied code, and later dismissed the DMCA claims as well. Only narrow breach-of-contract theories survived.

This outcome effectively tells open-source developers: you can release code under a license that requires attribution, a multi-billion-dollar company can ingest that code to train a commercial product that generates code without attribution, and you have no practical legal remedy. For the millions of developers who contribute to open source under the social contract that their license terms will be respected, this is a betrayal of the foundational trust that open-source ecosystems depend on.

The structural reason this persists is that open-source licenses were designed for a world of human-to-human code sharing, not machine learning. The GPL requires derivative works to carry the same license — but courts have not determined whether AI training constitutes creating a 'derivative work.' The MIT license requires attribution in copies — but courts have not determined whether AI-generated code that statistically resembles training data constitutes a 'copy.' These licenses assumed that the entity using the code would redistribute it in recognizable form. AI models break this assumption by ingesting code, learning patterns, and generating output that may be functionally identical but not a literal copy. The legal frameworks that sustained open-source collaboration for 30 years have no answer for this new mode of consumption.

legal

Translation was the first creative profession to be hollowed out by AI. More than a third of translators reported losing work due to generative AI by 2025, with 43% reporting income drops. Some translators have seen earnings fall by 60-80% from their peak. The International Monetary Fund cut its translator and interpreter staff from 200 to 50. One translator told CNN he lost 70% of his income when EU translation work dried up. The American Translators Association reports that translators are leaving the profession in significant numbers.

The reason translation collapsed first is that translation has a uniquely verifiable output — a correct translation can be checked against the source, making it easy for clients to validate AI output and decide human translators are unnecessary. But what clients do not realize is that AI translation still fails catastrophically on context, cultural nuance, legal precision, and literary voice. The problem is that these failures are invisible to monolingual clients who cannot read the source language. A marketing team that uses AI to translate campaign copy into Japanese may never know that the translation is tonally wrong, culturally offensive, or grammatically awkward — they just see that it is cheap and fast. By the time the damage shows up in poor market reception or customer complaints, the translators who could have prevented it have already left the profession.

This problem persists because the remaining work for human translators is increasingly degrading. Instead of translating from scratch, translators are now hired as 'post-editors' — cleaning up AI output at rates 50-70% lower than original translation rates. This means translators are doing the cognitively hardest part of the work (catching subtle errors that AI misses) while being paid less than ever. The economic incentive to enter the profession has evaporated: translation programs at universities are seeing declining enrollment, and the pipeline of skilled translators for less-common language pairs is drying up. Oxford researchers estimate that roughly 28,000 additional translator jobs would have existed in the US alone without machine translation's impact.

legal

On March 2, 2026, the US Supreme Court denied certiorari in Thaler v. Perlmutter, ending Dr. Stephen Thaler's years-long attempt to register copyright for a piece of visual art autonomously created by his AI system, the Creativity Machine. The D.C. Circuit had already affirmed human authorship as a 'bedrock requirement' of copyright law. The Copyright Office's January 2025 report reaffirmed that purely AI-generated content cannot be copyrighted, though works created by humans using AI tools may qualify if they embody 'meaningful human authorship.'

The practical consequence is devastating for businesses building on AI-generated content. If a company uses AI to generate marketing copy, product images, code, or design assets, those outputs may have no copyright protection at all. Any competitor can freely copy them. This creates an absurd situation: a company might spend thousands of dollars on AI tools and prompt engineering to create content, only to find that the content is effectively public domain. For AI-native businesses — companies whose entire value proposition is built on AI-generated output — this means their core product may be legally unprotectable.

This problem persists because copyright law was written with a clear assumption that creation requires a human mind. The Copyright Clause of the US Constitution grants protection to 'authors,' and two centuries of case law has interpreted this to mean human authors. The Copyright Office has tried to draw a line by allowing protection for AI-assisted works with sufficient human creative input, but the threshold for 'meaningful human authorship' is undefined and untested. The result is a spectrum of uncertainty: a human who writes a detailed prompt and then heavily edits AI output may have copyright protection, while a human who writes a one-line prompt and uses the raw output may not. Nobody knows where the line is, and the Supreme Court's refusal to hear the case means there will be no definitive answer from the highest court for years.

legal

Website owners who add AI crawler blocks to their robots.txt files are discovering it makes little difference. The robots.txt protocol is a voluntary convention, not a legal requirement — crawlers can simply ignore it. In December 2025, OpenAI removed robots.txt compliance language from its ChatGPT-User crawler documentation. Multiple AI companies have been documented ignoring robots.txt exclusions entirely. And even when a company's primary crawler respects robots.txt, the content often exists on third-party sites, caches, or data brokers that the original publisher does not control.

This matters because it means content creators have zero meaningful control over whether their work becomes AI training data. A photographer who hosts a portfolio site, a journalist who publishes articles, or a developer who maintains technical documentation cannot prevent their work from being ingested. Google compounds the problem: publishers who opt out of Google's AI training crawlers also lose visibility in regular search results, because Google uses the same indexed content for both traditional search and AI Overviews. This creates a coercive dynamic where opting out of AI training means opting out of search traffic — the primary discovery mechanism for most websites.

The structural reason this persists is that the internet was built on a model of open access, and robots.txt was designed for a world where the worst-case scenario of ignoring it was getting indexed by a search engine. There is no technical enforcement mechanism — robots.txt is a polite request, not a locked door. Legal remedies are uncertain: the Computer Fraud and Abuse Act was weakened by the 2021 Van Buren Supreme Court decision, and scraping publicly available data may not violate any existing statute. The Really Simple Licensing (RSL) framework launched in September 2025 with 50+ publishers is an attempt to standardize licensing, but adoption is voluntary and does not solve the enforcement gap. Cloudflare launched an AI bot blocking tool in July 2025, but determined scrapers can still circumvent technical blocks.
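For reference, an opt-out request looks like the snippet below, using crawler tokens the companies publish (GPTBot for OpenAI training, CCBot for Common Crawl, Google-Extended for Google AI training). As the paragraph notes, nothing compels a crawler to honor it, and Google-Extended does not remove a site from Search or from AI Overviews:

```
# robots.txt -- a polite request, not an enforcement mechanism
User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: CCBot             # Common Crawl, widely used as training data
Disallow: /

User-agent: Google-Extended   # Google AI training opt-out; does not affect
Disallow: /                   # Search indexing or AI Overviews
```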

legal

Voice-over actors Paul Lehrman and Linnea Sage discovered that Lovo, Inc., an AI text-to-speech company, had used recordings they made through Fiverr gigs to train AI voice clones of their voices — clones that Lovo then sold commercially. The actors were originally paid small amounts for what they believed were limited-use recordings. Instead, their voices became products sold to thousands of customers, generating revenue the actors will never see.

In July 2025, Judge Oetken in the Southern District of New York ruled that federal copyright and trademark law does not protect a person's voice as an abstract concept. The court dismissed the copyright claims entirely, finding that the actors sought protection for their voice attributes rather than specific recorded performances. This means that under current federal law, there is no remedy for having your voice cloned by AI — your voice is not copyrightable. For the roughly 100,000 voice actors working in the US, this creates an impossible situation: any recording they make for any client could become training data for an AI clone that replaces them permanently.

The structural reason this problem persists is that voice acting has always operated in an informal economy. Most voice actors work through platforms like Fiverr, Voices.com, or direct contracts with vague or nonexistent terms about derivative works and AI training. The Lehrman court did allow breach-of-contract claims to proceed, suggesting that platform terms of service might offer some protection — but this requires voice actors to have had explicit contractual restrictions in place, which almost none of them did before 2024. State right-of-publicity laws like New York's digital replica provision and Tennessee's ELVIS Act offer partial protection, but coverage is inconsistent across states, enforcement is expensive, and most voice actors cannot afford litigation against well-funded AI companies.

legal

AI image generators like Midjourney and Stable Diffusion allow anyone to type 'in the style of [artist name]' and produce images that closely emulate a specific living artist's visual style. In the Andersen v. Stability AI lawsuit, the complaint showed that prompting Midjourney with 'gerald brom chef' produced images of fantastical demonic chefs that closely mimicked Brom's signature style and character designs. This is not theoretical — it is happening at scale, and it is destroying the livelihoods of working illustrators.

The reason this is devastating is that an artist's style IS their product. A concept artist who spent 15 years developing a recognizable visual language can now be undercut by anyone with a $10/month Midjourney subscription who types their name into a prompt. When a game studio or ad agency can generate 50 images 'in the style of' an artist in minutes rather than commissioning that artist for $5,000, the economic moat that justified years of practice evaporates. The Artists Rights Alliance found in 2025 that 74% of professional visual artists reported lost income directly attributable to clients substituting AI-generated imagery for commissioned work.

This problem persists because copyright law protects specific works, not styles. You cannot copyright a style, a color palette, or an aesthetic sensibility — only specific fixed expressions. This means that even though AI companies scraped billions of copyrighted images to learn these styles (the LAION dataset was built by scraping stock photo sites, DeviantArt, Pinterest, and Flickr), the output that mimics an artist's style may not technically infringe on any single copyrighted work. The law was designed for a world where style mimicry required human skill and years of study; it has no framework for machines that can mass-produce style copies at zero marginal cost. Judge William Orrick allowed the Andersen case to proceed in August 2024, but the core legal question of whether AI style mimicry constitutes infringement remains unanswered.

legal

Streaming platforms like Spotify and Deezer now receive tens of thousands of AI-generated tracks per day. Deezer estimated in April 2025 that 18% of daily uploads — roughly 20,000 tracks — are AI-generated. Fraudsters use AI song generators like Suno and Udio to mass-produce tracks, then stream each one just enough times to collect royalties without triggering fraud detection. The Michael Smith case, where a North Carolina man allegedly used AI to generate hundreds of thousands of songs and earn over $10 million in fraudulent royalties, is just one known example.

This matters because streaming royalty pools are finite. Every dollar paid out to a fake AI track is a dollar taken from a real musician. Estimates suggest over $1 billion per year is being extracted from legitimate artists' royalty pools. For independent musicians who depend on streaming income to survive — many earning fractions of a cent per stream already — this dilution is existential. A bedroom artist who gets 50,000 genuine streams per month might see their per-stream rate drop because the total pool is being divided among millions of fraudulent tracks.

This problem persists because the economics of streaming platforms incentivize volume over quality. Platforms earn from subscriptions regardless of what gets streamed, so they have weak incentives to aggressively filter AI slop. The DMCA's safe harbor protections shield platforms from liability as long as they respond to takedown notices, but there is no practical way for individual musicians to file DMCA claims against millions of anonymous AI-generated tracks. The detection technology exists but lags far behind the generation technology, and platforms are reluctant to invest heavily in filtering when AI-generated content increases their total catalog numbers — a metric they use to attract investors and subscribers.
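The dilution mechanics follow from pro-rata royalty pools: a fixed pool divided by total streams. Here is a toy model — the pool size and stream counts are invented for illustration, not actual platform figures; only the mechanism matches the text:

```python
# Pro-rata pool dilution, toy model. Pool size and stream counts are
# invented for illustration; only the mechanism matches the text.

pool = 1_000_000_000     # hypothetical monthly royalty pool ($)
real_streams = 200e9     # hypothetical legitimate streams that month
artist_streams = 50_000  # the bedroom artist from the paragraph above

for fake_streams in (0, 20e9):  # 0%, then +10% fraudulent volume
    rate = pool / (real_streams + fake_streams)
    print(f"fraud {fake_streams / 1e9:>4.0f}B streams -> "
          f"${rate:.6f}/stream, artist payout ${artist_streams * rate:.2f}")
# Every fraudulent stream shrinks the per-stream rate for every real artist.
```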

legal

Fisheries management in the United States is built on stock assessments — scientific estimates of how many fish are in the water, how fast they reproduce, and how much can be safely harvested. The Magnuson-Stevens Act requires that catch limits be based on the 'best scientific information available.' The problem is that for many economically important fisheries, the best available science is years or even decades old. Some stocks have never been formally assessed because there is not enough data — no landings records, no survey data, no biological samples. For stocks that have been assessed, updates happen infrequently due to resource constraints, meaning catch limits may be based on population estimates from five or ten years ago.

This creates a specific, maddening situation for fishermen: they can see abundant fish on their sonar and in their nets, but they are legally prohibited from catching them because the quota is based on an old assessment that showed lower abundance. Conversely, when stocks decline between assessments, fishermen may be allowed to catch more than the population can sustain, leading to the overfishing that the system is designed to prevent. Congressional hearings have documented that 'lack of accurate, up-to-date data for numerous economically vital fisheries has caused significant problems' and that NOAA 'has proceeded to implement provisions in a manner that ignores profound shortfalls in requisite data.'

The economic cost is real. When catch limits are set too low relative to actual abundance, fishermen leave money on the table — fish that could be sustainably harvested go uncaught, and the communities that depend on fishing income suffer. When limits are set too high, the long-term damage to fish stocks costs future fishing revenue. In either case, the fisherman bears the consequences of data the agency failed to collect or update. Due to the timing of stock assessments, NOAA itself acknowledges that 'it may take several years before we are able to determine if catch limits successfully ended overfishing.'

This problem persists because stock assessments are expensive and labor-intensive. Each assessment requires dedicated research vessel time, biological sampling, data analysis, and peer review. NOAA's budget for fisheries science has not kept pace with the number of stocks requiring assessment or the increasing complexity of the models used. The agency manages over 500 fish stocks and stock complexes, but only a fraction receive regular assessments. The structural issue is that the Magnuson-Stevens Act mandates science-based management but does not mandate the funding to produce the science. The result is a system that demands precision from a data foundation full of gaps.

agriculture

About 12% of U.S. commercial fishermen engage in some form of direct marketing — selling catch at the dock, at farmers markets, through community-supported fishery (CSF) shares, or online. For fishermen receiving $1–3 per pound at the dock from wholesalers, direct sales at $8–15 per pound represent a transformative increase in revenue. But the regulatory path to legal direct sales is so complex that most fishermen never attempt it, and those who do spend months navigating a patchwork of federal, state, and local requirements.

The barriers are concrete and specific. Selling seafood directly to consumers typically requires a separate retail or direct marketing license on top of existing commercial fishing permits. Many states require that fish be processed in a licensed facility — even if 'processing' just means filleting and vacuum-sealing. Building or renting a licensed processing space costs tens of thousands of dollars. Health department requirements for temperature logging, HACCP plans, and labeling add administrative burden that a single fisherman working off a 40-foot boat is not equipped to handle. In some jurisdictions, fishermen are flatly prohibited from selling directly from the vessel at the dock — an absurd restriction given that this is the freshest possible point in the supply chain.

The consequence is that the vast majority of fishermen remain locked into selling to a handful of wholesale buyers at commodity prices. The middlemen — processors, distributors, and retailers — capture most of the value chain. A fisherman might receive $1.50/lb for sockeye salmon that sells for $25/lb at a grocery store. The fisherman takes all the physical risk, operates in the most dangerous occupation in America, and captures 6% of the retail value. This economic structure is a major driver of the 'graying of the fleet' — young people look at the economics and rationally choose other careers.

This problem persists because food safety regulations were designed for industrial-scale processing facilities, not for individual fishermen selling small quantities of fresh, whole, or simply processed fish. Regulators apply the same framework to a fisherman selling 50 pounds of halibut at a farmers market as they do to a factory processing millions of pounds. There is no federal 'cottage food' exemption for seafood equivalent to what exists for baked goods and preserves in many states. The regulatory asymmetry means that the safest, freshest, most traceable seafood — fish sold directly by the person who caught it — faces higher regulatory hurdles per pound than industrial seafood that passes through five intermediaries.

agriculture

Commercial fishing workers are approximately 28 times more likely to die on the job than the average American worker, making it consistently one of the deadliest occupations in the United States according to Bureau of Labor Statistics data. Falls overboard, vessel disasters, and diving incidents account for the majority of fatalities. Beyond deaths, fishing injuries are massively underreported — even the largest fishing companies in the U.S. routinely fail to comply with regulations requiring reporting of lost-time accidents, making it impossible to compute accurate injury rates for the industry.

The regulatory landscape is a jurisdictional mess that leaves fishermen in a gap. OSHA has authority over workplace safety on land, and the Coast Guard has authority over vessel safety at sea, but most commercial fishing vessels are classified as 'uninspected vessels' — meaning they are not subject to the comprehensive safety standards that apply to inspected commercial vessels like cargo ships or ferries. There are far too few regulations relating to fishing vessel safety for uninspected commercial fishing vessels, and the Coast Guard has far too few resources to inspect and enforce even the basic safety regulations that do exist. A restaurant kitchen gets regular health and safety inspections; a fishing vessel where people die at 28x the national rate may go years without one.

The human cost is concentrated in specific communities. When a fisherman dies or is permanently disabled, the economic impact cascades through a small town: a family loses its primary income, a boat loses its most experienced crew member, and the community loses institutional knowledge about local fishing grounds, weather patterns, and seamanship that took decades to accumulate. Unlike most industries, commercial fishing has no mandatory workers' compensation in many states, and the Jones Act — which governs maritime injury claims — requires fishermen to prove employer negligence, a higher bar than standard workers' comp.

This problem persists because the commercial fishing fleet is fragmented across thousands of small, independent operators who lack the resources to invest in modern safety equipment and training. There is no industry-wide safety culture analogous to what exists in commercial aviation or oil and gas. The Coast Guard's commercial fishing vessel safety program is chronically underfunded relative to the size and geographic spread of the fleet. A 2023 GAO report found that the Coast Guard needed to take additional actions to improve commercial fishing vessel safety efforts. The fundamental structural issue is that fishing is treated as an inherently dangerous occupation where risk is accepted rather than engineered away, and the regulatory framework reflects this fatalistic assumption.

agriculture

The Saltonstall-Kennedy Act of 1954 established a fund using tariffs collected on imported seafood. The explicit purpose was to 'promote and develop' U.S. fisheries — fund marketing campaigns, support economic development in fishing communities, and help domestic seafood compete against imports. In practice, NOAA has for years transferred the bulk of this revenue into its general Operations, Research, and Facilities account to fund its own science and management activities. The money that was supposed to help fishermen sell their product is instead paying for the agency that regulates them.

The practical impact is that U.S. commercial seafood has no equivalent of the beef industry's 'Beef: It's What's for Dinner' or the pork industry's marketing campaigns, which are supported by USDA checkoff programs. American fishermen catch some of the highest-quality, most sustainably managed seafood in the world, but there is no national campaign to tell consumers about it. Meanwhile, imported farmed shrimp from India and Indonesia — produced with lower labor and environmental standards — dominates grocery store shelves because it is cheap and heavily marketed by foreign producers. U.S. fishermen watch their own tariff dollars get redirected while they lose market share.

The dollar amounts are significant. The S-K fund collects roughly $350 million per year in import duties. Only a fraction — historically around $10–15 million — goes to competitive grants for fisheries research and development. The rest disappears into NOAA's general budget. Fishermen have argued for decades that if even a portion of these funds were used as intended — to market domestic seafood, develop new products, open new markets — it could meaningfully shift demand toward U.S.-caught fish and improve ex-vessel prices.

This problem persists because NOAA has an institutional incentive to keep the money: its budget is always under pressure, and the S-K funds are a convenient supplement. Congress has periodically pushed back with appropriations language restricting the use of these funds, but enforcement is weak and the money continues to flow into general operations. The fishing industry lacks the political weight of agriculture — there are only about 170,000 commercial fishing jobs in the U.S., compared to millions in farming — so there is no powerful lobby to force the reallocation. The structural issue is that the agency charged with managing fisheries is also the agency that benefits from diverting the funds, creating an irreconcilable conflict of interest.

agriculture

The United States imported an estimated $2.4 billion worth of seafood derived from illegal, unreported, and unregulated (IUU) fishing in 2019 — representing nearly 11% of total U.S. seafood imports and over 13% of imports caught at sea. Globally, IUU fishing accounts for about 20% of total catch, and in some regions it reaches 50%. The top sources of IUU imports to the U.S. are China, Russia, Mexico, Vietnam, and Indonesia. For domestic fishermen, IUU imports create an impossible competitive dynamic. A U.S. fisherman operating legally must comply with catch limits, gear restrictions, seasonal closures, observer requirements, and reporting mandates — all of which cost money and constrain output. An IUU operator in another country ignores all of these rules, catches as much as they want with whatever gear they want, and sells the product into the same U.S. market at lower prices. The legal fisherman absorbs the costs of sustainability; the illegal operator free-rides on the resource and undercuts on price. This is not a theoretical concern — the Southern Shrimp Alliance has documented how dumped, illegally caught foreign shrimp has driven down prices for U.S. wild-caught shrimp to the point where Gulf Coast shrimpers cannot cover their operating costs. The ecological damage compounds the economic harm. When IUU-caught fish enters the market alongside legally caught fish, it inflates apparent supply and depresses prices, which reduces the economic incentive for legal fishing. Meanwhile, the IUU catch itself is depleting fish stocks that legal fishermen depend on. It is a double hit: less fish in the water and lower prices at the dock. This problem persists because the U.S. Seafood Import Monitoring Program only covers 13 species groups, leaving the majority of imported seafood unmonitored. Tracing the origin of seafood through complex global supply chains — where a fish caught in one country may be processed in a second, repackaged in a third, and sold in a fourth — is technically difficult and resource-intensive. Customs and Border Protection lacks the capacity to inspect more than a small fraction of seafood shipments. The fundamental structural issue is that the U.S. imports 70–85% of its seafood but has no comprehensive system to verify that those imports were legally caught.

agriculture

Independent fishermen in rural coastal communities describe shore-side infrastructure as their number one concern — not boats and permits, but the ice machines, cold storage units, fuel docks, and processing lines that make it physically possible to land fish in marketable condition. Over the past two decades, these facilities have been steadily disappearing. Local ice machines get turned off. Processing lines are mothballed. Commercial docks erode and are not repaired, or are redeveloped into condos and marinas. When the last facility in a community closes, fishermen face a stark choice: steam hours to the next port (burning fuel and degrading catch quality), or stop fishing. The immediate consequence is economic: fish that sits for hours without ice loses grade and value. A premium sushi-grade tuna becomes commodity-grade. Fresh halibut becomes frozen halibut. The price difference can be 50% or more. For a fisherman operating on thin margins, the inability to ice and store catch properly can be the difference between a profitable trip and a loss. The secondary consequence is competitive: when fishermen are forced to consolidate into fewer, larger ports, those ports gain monopsony power — fewer buyers competing for catch means lower prices paid to fishermen. The community-level impact is existential. Working waterfronts are the economic backbone of hundreds of small coastal towns. When the dock closes, the fuel supplier loses a customer. The marine supply store closes. The restaurant that served local catch switches to frozen imports. Young people leave. The tax base shrinks. The cycle is self-reinforcing: less infrastructure leads to fewer fishermen, which leads to less demand for infrastructure, which leads to more closures. This problem persists because maintaining cold storage and dock infrastructure in a corrosive marine environment is expensive, and the revenue from serving a shrinking fleet of small boats does not justify the capital investment. Unlike farms, which benefit from USDA Rural Development grants, rural fishing communities have no equivalent federal infrastructure program. The Fishing Industry Credit Enhancement Act — which would allow cold storage providers serving fishing operations to access Farm Credit System loans — has been introduced in Congress but has not passed. The structural gap is that American agricultural policy has spent a century building support systems for farmers, ranchers, and loggers, but has largely ignored fishermen.
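
To make the trip-level economics concrete, here is a back-of-the-envelope sketch in Python. The 50% grade discount comes from this section; the catch size, prices, and fuel figures are hypothetical assumptions chosen only to make the arithmetic visible, not data from any real port:

    # What losing the local ice house costs per trip, under stated assumptions.
    catch_lbs = 2_000          # hypothetical trip landing
    premium_price = 6.00       # $/lb for properly iced, fresh-grade fish (assumed)
    grade_discount = 0.50      # "can be 50% or more" per the section above

    extra_steam_hours = 4      # hypothetical round-trip detour to the next port
    fuel_gph = 15              # gallons per hour for a small boat (assumed)
    fuel_price = 4.00          # $/gallon (assumed)

    # Option A: no local ice -- the fish sits, loses grade, sells at a discount.
    revenue_degraded = catch_lbs * premium_price * (1 - grade_discount)

    # Option B: steam to the next port with ice -- keep the grade, burn fuel.
    fuel_cost = extra_steam_hours * fuel_gph * fuel_price
    revenue_steaming = catch_lbs * premium_price - fuel_cost

    print(f"Sell degraded locally:  ${revenue_degraded:,.0f}")
    print(f"Steam to the next port: ${revenue_steaming:,.0f}")

Under these assumptions the fisherman gives up $6,000 in grade or burns $240 in extra fuel every trip — and the fuel option only works as long as the next port's facilities stay open.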

agriculture

A meta-analysis of U.S. seafood studies found that 39.1% of samples were mislabeled, with 26.2% involving outright species substitution — cheap fish sold as expensive fish. Tilapia is sold as red snapper. Farmed salmon is labeled as wild-caught. A February 2026 FAO/IAEA report estimated that roughly 20% of aquatic products globally are intentionally mislabeled. NOAA's own inspectors, who see about one-fifth of U.S. seafood, find fraud in up to 40% of products submitted to them. For an honest fisherman catching wild-caught red snapper in the Gulf of Mexico, this is devastating. They spend $3–5 per pound in operating costs to catch real snapper. A fraudulent supplier relabels tilapia — which costs $1–2 per pound wholesale — as snapper and sells it at snapper prices, pocketing the margin. The restaurant or grocery store buyer cannot tell the difference visually, so the fraudster undercuts the honest fisherman on price while still charging premium prices to the end consumer. The honest fisherman loses the sale. Multiply this across millions of pounds and thousands of transactions, and the cumulative effect is that legitimate domestic fishermen are competing against a phantom market of misrepresented product. The downstream consequences extend beyond economics. When consumers unknowingly buy mislabeled fish, they may consume species with different allergen profiles, mercury levels, or sustainability statuses. When a 'sustainably caught' label is slapped on illegally harvested fish, it undermines the entire market signal that is supposed to reward responsible fishing practices. Fishermen who invest in sustainable gear, follow catch limits, and absorb regulatory costs get zero market premium because the labels are unreliable. This problem persists because the U.S. Seafood Import Monitoring Program (SIMP) only covers 13 species groups — a fraction of the hundreds of species entering the country. DNA testing to verify species is available but not routinely required at point of import or sale. The supply chain is long and opaque: a fish may pass through five or more intermediaries between the boat and the plate, and each handoff is an opportunity for relabeling. There are over 1,800 species of seafood sold in the U.S., making comprehensive monitoring practically impossible with current resources.
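
The arithmetic of substitution fraud is worth spelling out. A minimal sketch, using the per-pound cost figures from this section; the wholesale price a buyer pays for 'red snapper' is a hypothetical assumption added for illustration:

    # The species-substitution margin described above, under stated assumptions.
    tilapia_cost = 1.50     # $/lb wholesale tilapia (midpoint of the $1-2 cited)
    snapper_cost = 4.00     # $/lb honest operating cost (midpoint of the $3-5 cited)
    snapper_price = 7.00    # $/lb hypothetical wholesale price for 'red snapper'

    honest_margin = snapper_price - snapper_cost    # $3.00/lb
    fraud_margin = snapper_price - tilapia_cost     # $5.50/lb

    # The fraudster can price below the honest boat's break-even and still
    # clear a larger margin than the honest fisherman ever could:
    undercut_price = snapper_cost - 0.50
    fraud_margin_undercut = undercut_price - tilapia_cost   # $2.00/lb

    print(f"Honest margin at ${snapper_price:.2f}/lb: ${honest_margin:.2f}/lb")
    print(f"Fraud margin at ${snapper_price:.2f}/lb:  ${fraud_margin:.2f}/lb")
    print(f"Fraud margin at ${undercut_price:.2f}/lb: ${fraud_margin_undercut:.2f}/lb")

The point the sketch makes is structural: the fraudster wins at any price the honest fisherman can afford to charge.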

agriculture

Under catch share programs (also called Individual Fishing Quotas or IFQs), the right to catch fish is allocated as tradeable quota. In theory, this creates efficient markets and prevents overfishing. In practice, quota has consolidated into the hands of large corporations and absentee investors who lease it back to working fishermen at rates that commonly consume 20–25% of the value of the fish caught, and in some fisheries far more. In the Gulf of Mexico red snapper fishery, leasing costs run about $4.50 per pound. A fisherman who catches 10,000 pounds of red snapper at $5/lb grosses $50,000 but pays $45,000 in quota lease fees alone — before fuel, crew, bait, ice, insurance, and maintenance. The result is a class of fishermen who own their boats, maintain their gear, risk their lives at sea, and do all the physical labor of fishing, but take home a fraction of the catch value because the right to fish has been financialized. Young fishermen who want to enter the industry face an impossible barrier: a Bristol Bay drift permit in Alaska costs around $130,000, with annual lease fees of $14,000–$15,000. A New England groundfishing permit starts at $30,000 minimum, and that is before buying a boat ($40,000+ for a small one), mooring it ($25,000), insuring it ($5,000), and buying gear ($10,000+). The total startup cost to become a commercial fisherman now exceeds $200,000 in many fisheries — comparable to buying a house, but with far less predictable income. This matters because it is destroying the demographic pipeline of commercial fishing. The average age of an Alaska state commercial fishing permit holder has risen from 40 in 1980 to about 50 today. In New England, the average age of groundfish and lobster captains is 55. When these fishermen retire, there are not enough young fishermen to replace them, because the financial barriers are too high. NOAA itself has flagged this 'graying of the fleet' as a threat to national food security. The problem persists because quota systems were designed by economists focused on preventing the 'tragedy of the commons' without considering the distributional consequences. Once quota becomes a tradeable asset, market forces inevitably concentrate it. There is no federal mechanism to prevent absentee ownership of fishing rights, no cap on lease rates, and no program equivalent to USDA beginning farmer loans specifically for new fishermen. The Young Fishermen's Development Act, signed in 2021, allocated only $2 million per year — a rounding error compared to the scale of the problem.
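
The red snapper numbers above can be laid out directly (a minimal sketch; every figure is one already cited in this section):

    # The Gulf red snapper lease arithmetic from this section, laid out.
    catch_lbs = 10_000     # pounds of red snapper landed
    ex_vessel = 5.00       # $/lb paid at the dock
    lease_rate = 4.50      # $/lb owed to the quota holder

    gross = catch_lbs * ex_vessel     # $50,000
    lease = catch_lbs * lease_rate    # $45,000
    left = gross - lease              # $5,000 -- before fuel, crew, bait,
                                      # ice, insurance, and maintenance

    print(f"Gross at the dock: ${gross:,.0f}")
    print(f"Quota lease:       ${lease:,.0f} ({lease / gross:.0%} of gross)")
    print(f"Left for the boat: ${left:,.0f}")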

agriculture

The National Marine Fisheries Service requires certain commercial fishing vessels to carry federal observers — trained monitors who ride along on fishing trips to count catch, measure bycatch, and ensure compliance. Under the Atlantic herring fishery management plan, vessel owners are required to fund these observers themselves, at a cost NMFS estimated at up to $710 per day per observer. For small, family-owned herring boats grossing $300,000–$500,000 per season, this represents up to a 20% reduction in annual returns. The financial pain is acute because these are not optional costs that fishermen can plan around. NMFS can mandate observer coverage on any given trip, and the vessel owner must pay regardless of how much fish they catch that day. A bad weather day, a mechanical issue, a poor fishing day — the observer still gets paid $710. For boats between 40 and 58 feet, carrying an observer also means leaving behind a crewman or changing fishing strategy, because there is physically no room for an extra person. So the mandate simultaneously increases costs and decreases productivity. This issue became the basis for Loper Bright Enterprises v. Raimondo, the 2024 Supreme Court case that overturned the Chevron deference doctrine — one of the most consequential administrative law rulings in decades. The fishermen argued that the Magnuson-Stevens Act does not authorize NMFS to force vessel owners to pay for observers. The Supreme Court did not resolve that statutory question directly; it held that courts must exercise independent judgment rather than defer to agency interpretations of ambiguous statutes, and sent the cases back to the lower courts. But the ruling did not eliminate the observer mandate itself — NMFS is still pursuing rules to require herring boats to carry observers, and the question of who pays remains unresolved. The problem persists because Congress has never appropriated enough money to fund observer coverage at the levels NMFS wants. Rather than scaling back coverage to match its budget, NMFS shifted costs to the industry. Small-boat fishermen lack the lobbying power to fight this in Congress, and the regulatory process moves slowly enough that by the time a rule is challenged, fishermen have already absorbed years of costs. The structural issue is a mismatch between the agency's monitoring ambitions and the economic reality of small-scale commercial fishing.
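
A rough sketch of that burden in Python. The $710 day rate and the season gross range come from this section; the number of observed days is a hypothetical assumption, since actual coverage depends on which trips NMFS selects:

    # The at-sea observer cost burden described above, under stated assumptions.
    observer_day_rate = 710     # $/day, the NMFS estimate cited above
    observed_days = 85          # hypothetical observed days per season (assumed)

    season_cost = observer_day_rate * observed_days    # $60,350

    for season_gross in (300_000, 500_000):
        share = season_cost / season_gross
        print(f"On ${season_gross:,} gross: ${season_cost:,} in observer fees "
              f"= {share:.0%} of gross revenue")

At the assumed coverage level, the fees land at roughly 12–20% of gross, consistent with the 'up to 20%' figure cited above.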

agriculture