Sears Home Services AI Chatbot Leak Exposes 3.7 Million Customer Records – What Happened and Why It Matters
KoreWealth
4/13/2026 · 3 min read


Sears Home Services is the nation’s largest appliance repair and home services provider. They handle repairs, maintenance, installation, HVAC (heating & cooling), home cleaning (carpet/air duct), parts, and home warranty plans (Sears Protect) for all major brands, even if you didn’t buy them from Sears. Stories like this one are a daily reminder of just how exposed networks and customer data can be, and why staying informed is the first step toward awareness.
In March 2026, a significant data exposure at Sears Home Services revealed critical vulnerabilities in AI-powered customer service systems. Security researcher Jeremiah Fowler discovered three publicly accessible, unencrypted databases containing approximately 3.7 million customer service records spanning 2024 through early 2026. The exposed files included chat logs, audio recordings, transcripts, and spreadsheets—totaling over 4.3 terabytes of data.
What Was Exposed?
The databases were tied directly to Sears Home Services’ AI customer support tools:
Samantha: The customer-facing AI virtual voice agent and chatbot used for phone calls, online chats, scheduling, and support.
kAIros (KAIros): Sears’ broader AI ecosystem for appointment scheduling and operational support.
Key details from the exposed data include:
3.7 million chat logs and transcripts (including one CSV file with 54,359 complete chat logs).
Over 1.4 million audio recordings (some up to 4 hours long, continuing to record ambient household sounds, TV audio, or private conversations even after customers believed the call had ended).
More than 2.1 million text files with scheduling conversations.
Over 200,000 spreadsheet logs (XLSX files).
Personally identifiable information (PII): Customer names, phone numbers, home addresses, email addresses, appliance details, repair/delivery appointments, and account information.
Conversations in both English and Spanish.
The files were stored in plaintext with no password protection or encryption, making them freely accessible to anyone who found the databases.
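To make the "no password protection" finding concrete, here is a minimal sketch of the kind of triage a security researcher might run after an unauthenticated request to an HTTP-fronted datastore. The status-code logic and response markers are illustrative assumptions, not details from Fowler's actual methodology or the exposed systems.

```python
# Sketch: triaging the response to an unauthenticated GET against a
# database endpoint. Marker strings and categories are assumptions
# for illustration, not details from the Sears incident.

def classify_exposure(status_code: int, body_snippet: str) -> str:
    """Classify what an unauthenticated request against a datastore implies."""
    if status_code in (401, 403):
        return "protected"      # server demanded credentials
    if status_code == 200:
        # A 200 containing recognizable index/record metadata suggests
        # the data really is open to anyone who finds the endpoint.
        markers = ("hits", "rows", "_index", "records")
        if any(m in body_snippet for m in markers):
            return "exposed"
        return "reachable"      # responds, but content is inconclusive
    return "unknown"

print(classify_exposure(403, ""))                 # protected
print(classify_exposure(200, '{"hits": {...}}'))  # exposed
```

A properly secured database would land in the "protected" bucket on the very first request; the Sears databases, by Fowler's account, would not have.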
How the Breach Was Discovered and Resolved
Fowler identified the exposed databases in early February 2026 while conducting routine security research. He immediately issued a responsible disclosure notice to Transformco, the parent company of Sears and Sears Home Services. The databases were secured the following day and are no longer publicly accessible. Transformco acknowledged the notice but provided no further public comment or details on the incident.
No evidence suggests the data was accessed maliciously before discovery, but the exact duration of the exposure remains unknown.
Why This Matters: The “Shadow AI” Risk
This incident is a textbook example of Shadow AI risks—the dangers that arise when AI tools are deployed without proper oversight, governance, or security controls from central IT/security teams. Sears Home Services (which still performs over 7 million appliance repairs annually) relied heavily on AI for customer interactions to modernize operations. However, the misconfigured databases show how quickly customer data can become exposed when AI systems log sensitive interactions without adequate safeguards.
Broader implications include:
Privacy and fraud risks: Scammers could use the PII for phishing, identity theft, or targeted social engineering. The voice recordings represent biometric data that could theoretically be used for voice cloning (though Fowler emphasized this is a hypothetical risk, not a confirmed one).
Customer trust erosion: People interacting with “Samantha” expected privacy, but hours of post-call audio potentially captured unrelated private conversations.
AI system exposure: Logs could reveal internal prompts, guardrails, and workflows, helping attackers probe for weaknesses.
Experts note that as companies rush to integrate generative AI into customer service, many overlook basic data protection—exactly the type of shortcut that turns innovation into a liability.
What Affected Customers Should Do
If you interacted with Sears Home Services via chat or phone between 2024 and early 2026:
Monitor your accounts for unusual activity.
Be wary of unsolicited calls, texts, or emails claiming to be from Sears (or referencing your repair/delivery details).
Watch for phishing attempts that reference specific appliance or appointment information.
Consider placing a fraud alert on your credit reports and enable two-factor authentication everywhere possible.
Use official Sears channels for any follow-up.
Lessons for Businesses Using AI
This case underscores the need for:
Proper configuration and access controls on all AI-related databases.
Encryption and password protection as baseline requirements.
Regular audits of “Shadow AI” tools (unofficial or poorly governed AI deployments).
Clear policies on data retention, especially biometric/audio recordings.
Giving customers the option to opt out of AI interactions or recordings.
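The retention point above is the easiest one to automate. Here is a minimal sketch of a retention sweep that flags audio recordings older than a policy window; the 90-day window, record layout, and field names are assumptions for illustration, not Sears' actual policy or schema.

```python
from datetime import datetime, timedelta, timezone

# Sketch: flagging recordings that have outlived a retention window.
# The 90-day window and record fields are hypothetical.
RETENTION = timedelta(days=90)

def overdue_recordings(records, now=None):
    """Return the IDs of recordings older than the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r["id"] for r in records if now - r["created"] > RETENTION]

now = datetime(2026, 3, 1, tzinfo=timezone.utc)
records = [
    {"id": "call-001", "created": datetime(2025, 10, 1, tzinfo=timezone.utc)},
    {"id": "call-002", "created": datetime(2026, 2, 20, tzinfo=timezone.utc)},
]
print(overdue_recordings(records, now))  # ['call-001']
```

A sweep like this, run on a schedule, keeps hours-long household audio from accumulating in a database for years — exactly the kind of data that made this exposure so damaging.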
References
Jeremiah Fowler’s full report on ExpressVPN (expressvpn.com): primary disclosure.
WIRED coverage (wired.com), March 17, 2026.


