This is an illustrative scenario based on common patterns we see across SMBs. Not a real client engagement. Names, figures, and details are constructed to reflect typical situations in this industry.
Common starting state
A property management firm with around 75 people—leasing agents, maintenance coordinators, an accounting team, and a handful of property managers—is fielding pressure from the ownership group to "use AI." Staff have already started experimenting on their own: several leasing agents use ChatGPT to draft rejection and approval letters, the accounting lead summarizes vendor invoices with it, and at least one property manager pastes tenant complaint emails into it to generate responses.
No written policy exists. No one has inventoried which accounts are personal (free plan) versus business. The firm handles tenant PII daily: Social Security numbers during application screening, bank account details for rent payments, and medical accommodation requests under fair housing law.
Risks identified
An audit in this scenario typically surfaces four significant findings:
Uncontrolled data flow. Free-tier ChatGPT accounts mean tenant PII is likely flowing into training pipelines. SSNs from rental applications are the most common critical finding. At least one person on the team has almost certainly pasted an application form in full.
Fair housing exposure. Lease rejection letters generated by AI without human review are a liability. If the AI's language patterns produce outputs that correlate with protected classes (even inadvertently), the firm faces fair housing risk on top of the data risk.
Vendor contracts without AI terms. The software stack for a property management firm typically includes a property management platform, a maintenance ticketing system, and a tenant portal. None of their vendor contracts from three years ago address AI features—yet two of those vendors have added AI summaries and chat features to their products, quietly.
No incident path. If a tenant's data were mishandled, no one knows who to call, what to preserve, or what disclosure timeline they're working against.
What we'd typically recommend
For a firm this size, recommendations focus on quick-win controls and a short written policy before any tool expansion:
- Upgrade to ChatGPT Team for the handful of staff using it regularly, and prohibit personal accounts for business use. Cost at the time of writing: approximately $25/user/month for the users who actually need it. For a 10-person subset, that's $250/month, cheap against the risk.
- Define the data perimeter explicitly. A one-page policy listing what can and cannot go into AI tools, distributed at the next all-hands. SSNs, bank account details, and medical accommodation requests are the three explicit prohibitions for this industry.
- Add AI terms to the vendor review checklist. When any vendor contract comes up for renewal, the review should now ask: does this vendor use our data to train models, what is their AI-feature data-handling policy, and do they offer a BAA if we need HIPAA-adjacent coverage?
- Pilot AI in one low-risk workflow. Rather than a broad rollout, identify one workflow that uses no PII, such as drafting maintenance update templates or writing property description copy for listings, and run the pilot there for 60 days. Build confidence and process before expanding.
- Document an incident path. One paragraph: if you think you've shared something you shouldn't have, contact this person within 24 hours. That's enough to start.
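The data-perimeter policy above can be backed by a lightweight technical check before anything is pasted into an AI tool. As a minimal sketch (the patterns and the `screen_for_pii` function are illustrative assumptions, not a vetted PII detector; a real deployment would use a dedicated PII-detection library):

```python
import re

# Illustrative patterns only -- three regexes are not a complete PII detector.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "bank_account": re.compile(r"\b\d{8,17}\b"),  # US account numbers vary in length
    "medical_accommodation": re.compile(
        r"\b(accommodation|disability|service animal)\b", re.IGNORECASE
    ),
}

def screen_for_pii(text: str) -> list[str]:
    """Return the names of prohibited categories detected in the text."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

sample = "Applicant SSN 123-45-6789, requests a service animal accommodation."
print(screen_for_pii(sample))  # -> ['ssn', 'medical_accommodation']
```

Even a screen this crude makes the one-page policy enforceable rather than aspirational: the three prohibited categories in the policy map one-to-one onto named checks.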
Outcome to expect
Firms in this scenario typically reclaim meaningful time in the workflows they successfully pilot. Lease communication drafts that took 20 minutes of back-and-forth drop to 5 minutes of review-and-send. Maintenance update templates get standardized. The accounting lead stops summarizing invoices in ChatGPT (which was the higher-risk use) and switches to the summarization built into the firm's accounting software, which carries contractual data protections.
The 18 hours per week figure in the scenario summary reflects that kind of shift: not a single dramatic automation, but time savings distributed across the leasing team and admin staff, in the workflows where AI is appropriate.
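The arithmetic behind a distributed figure like that is worth making explicit. A sketch of one plausible decomposition, where the weekly volumes are illustrative assumptions rather than measured data:

```python
# Hypothetical weekly volumes -- assumptions for illustration, not measurements.
lease_drafts_per_week = 40
minutes_saved_per_draft = 20 - 5        # 20-minute draft cut to a 5-minute review

maintenance_updates_per_week = 60
minutes_saved_per_update = 8            # assumed savings from standardized templates

total_hours = (
    lease_drafts_per_week * minutes_saved_per_draft
    + maintenance_updates_per_week * minutes_saved_per_update
) / 60

print(f"{total_hours:.0f} hours/week")  # -> 18 hours/week
```

The point is not these particular numbers but the shape: many small per-task savings across two teams, not one large automation.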
The more important outcome: the firm now has a documented position. If a tenant ever asks "do you use AI in your operations," there's an honest answer backed by policy.