Mass Appraisal Algorithms Give Homeowners No Explanation for Their Assessed Value
Most U.S. counties now use Computer Assisted Mass Appraisal (CAMA) systems that algorithmically assign values to hundreds of thousands of properties at once. When a homeowner receives their assessment notice and asks 'why is my home valued at $X?', the answer is effectively 'because the model said so.' The IAAO's own standards explicitly state that a property owner should never be told 'the computer produced the appraisal,' yet in practice that is exactly what happens. A homeowner trying to appeal has no way to understand which variables the model weighted, which comparable sales it used, or why it valued their home differently from their neighbor's. They're fighting a black box. The structural cause is that CAMA vendors sell proprietary software with opaque formulas, assessor staff often don't fully understand the models themselves, and there is no legal requirement in most states for assessors to disclose the specific inputs and weights that produced an individual valuation.
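To make concrete what a per-property explanation could look like, here is a minimal sketch of the hedonic-regression approach commonly used inside CAMA systems, with an itemized breakdown of each variable's contribution to the assessed value. All data, features, and coefficients below are invented for illustration; real CAMA models are far larger and typically proprietary.

```python
# Illustrative sketch only: a toy hedonic regression of the kind CAMA
# systems often use internally. All sales data here are fabricated.
import numpy as np

# Each row of X: [square_feet, bedrooms, lot_acres] for a recently sold home.
X = np.array([
    [1400, 3, 0.20],
    [1850, 4, 0.25],
    [1100, 2, 0.15],
    [2200, 4, 0.40],
    [1600, 3, 0.30],
], dtype=float)
sale_prices = np.array([210_000, 280_000, 165_000, 340_000, 245_000], dtype=float)

# Fit price ~ intercept + w1*sqft + w2*beds + w3*acres by least squares.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, sale_prices, rcond=None)

# Value a subject property and itemize each variable's contribution --
# the per-input breakdown that assessment notices typically omit.
subject = np.array([1500, 3, 0.22])
contributions = coef[1:] * subject
assessed = coef[0] + contributions.sum()
for name, c in zip(["square_feet", "bedrooms", "lot_acres"], contributions):
    print(f"{name}: {c:,.0f}")
print(f"assessed value: {assessed:,.0f}")
```

A notice built from such a model could, in principle, list each feature's dollar contribution and the comparable sales that trained it; the point of the sketch is that the opacity homeowners face is a disclosure choice, not a technical impossibility for linear models (though more complex models would need additional explanation tooling).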
Evidence
- The IAAO Standard on Mass Appraisal of Real Property explicitly warns assessors against telling taxpayers 'the computer did it' (iaao.org).
- MOST Policy Initiative's science note on AI and property assessments documents algorithmic bias risks and opacity in mass appraisal.
- Catalis, a CAMA vendor, acknowledges that legacy tools contain 'opaque formulas' and markets transparency as a differentiator.
- Europe's GDPR establishes a 'right to explanation' for automated decisions; no equivalent exists in U.S. property tax law.
- The IAAO's Guidance on Developing Mass Appraisal notes that all models require human oversight but provides no enforcement mechanism for transparency.