How New AI Technology Can Hurt People With a Criminal Record and How a Lawyer Can Help
- Brinkley Law

- Jul 18, 2025
- 2 min read
Artificial intelligence (AI) is now used almost everywhere by employers, landlords, banks, and even stores. It helps them make fast decisions, but if you have a criminal record, it could be making the wrong ones about you.
In 2025, government agencies warned companies not to let AI unfairly block people from jobs, housing, or credit. But the truth is, many AI systems still pull data from old public records. That means someone with a cleared or minor record could still get denied without a fair shot. Often, you won't even know AI was used to reject you.
There’s more. Some stores and businesses are using AI facial recognition tools to track shoplifting. But these systems can be wrong. In fact, the Federal Trade Commission (FTC) just acted against a company for falsely tagging innocent people as criminals using this kind of technology.
In Indiana, lawmakers also passed a new law in 2025 making it a crime to create fake images (like deepfakes) that insert someone’s face into explicit content. These AI-generated images can cause serious harm, especially for people who are trying to rebuild their lives after a past mistake.
Here’s where a lawyer can help:
- Make sure AI is not using sealed or incorrect records against you
- Challenge unfair background checks
- Demand records and corrections from credit and data companies
- Defend your rights if you're wrongly flagged by a store or app
If you’ve been denied a job, housing, or a loan and you think your record or a mistake in the system played a part, contact Brinkley Law today. Brinkley Law offers confidential case reviews and can help you clear your name and protect your future. Don’t let bad data make big decisions for you.