Crime and Code: The Benefits and Challenges of Using AI in Law Enforcement Investigations

AI regulation is a top concern for law enforcement, as policymakers walk a fine line, trying to support the safe use of this technology without hindering innovation.

Lt. Eric Kinsman, Commander, New Hampshire Internet Crimes Against Children Task Force

January 10, 2025


Today, law enforcement agencies are seeing an increase in artificial intelligence (AI)-related crime. With 90% of cases involving at least one digital element, and the average personal device containing hundreds of thousands of messages, images and videos, many law enforcement teams are now turning to the same kind of technology to quickly analyze high volumes of data, identify patterns in evidence and accelerate investigations.

These AI-powered investigations have been particularly impactful in cases of internet crimes against children (ICAC), especially those involving child sexual abuse material (CSAM), where predators are using the technology to create alarmingly realistic explicit deepfake images and videos of children.

The Benefits of AI in Investigations

Currently, 61% of law enforcement agencies see AI as a valuable tool for digital forensics and investigations. These agencies are leveraging the technology to address challenges associated with rising case backlogs and limited staff while unlocking deeper, more actionable insights.

  • Accelerating investigations: AI can sift through evidence almost instantly to surface what is critical to a case and provide a contextual summary, expediting tasks that traditionally have taken days to complete.

  • Improving pattern recognition: AI strengthens pattern recognition and anomaly detection, which can be vital in cases featuring large volumes of digital evidence.

  • Alleviating case backlogs: Law enforcement teams are increasingly stretched thin, with 69% of professionals feeling they do not have enough time to review all the data in their cases. Examiners face an average of three to four weeks of case backlogs due to the increased amount of data involved in digital investigations. AI can help automate tedious tasks, freeing up examiners’ time to better support all their cases.


Challenges, Limitations and Obstacles to Using AI in Investigations 

While the benefits of these tools are promising, their implementation isn’t without challenges and potential limitations.

AI regulation is a top concern for law enforcement, as policymakers walk a fine line, trying to support the safe use of this technology without hindering innovation. According to Cellebrite’s annual Industry Trends survey, 60% of law enforcement professionals believe the use of AI will be limited by regulations and procedures, and more than half are concerned about how those regulations will affect their AI implementations.

Many law enforcement professionals are also concerned that AI could replace their jobs. However, human investigators will remain essential for reviewing and verifying data, while these technologies handle the otherwise unmanageable volume of digital evidence in today’s cases.


Best Practices for Organizations Looking to Deploy AI for Investigations

As teams implement AI in investigations, there are some best practices professionals can follow to use the technology optimally.

  • Skills training for staff: AI is not a replacement for human teams. Staff need to be trained on how to correctly and ethically use the tech. This can come in the form of training events, such as capture-the-flag competitions. Additionally, organizations like the International Association of Chiefs of Police (IACP) hold training sessions and conferences to educate the industry. These hands-on experiences with the latest AI-powered tech for digital investigations help ensure teams are ready when their agency introduces it into the workflow. Sound training also helps ensure the admissibility of digital evidence in court.

  • Keep a human in the loop: While AI promises more efficient investigations, organizations must ensure human oversight is in place to verify the technology’s findings before they are used in an investigation. While many professionals have become more proficient with AI, executives estimate that up to 40% of the workforce will need to reskill as the technology becomes more prevalent. And because not all agencies have access to the same tools, it’s important to stay up to date through conferences, training seminars and digital forensics courses.

  • Selecting the right tool: Whether for law enforcement or the enterprise, organizations must ensure their AI solutions meet their unique workflow needs. With the rising volume of digital evidence, these tools must be capable of automating investigators’ most tedious work and providing actionable insights to help accelerate investigations. For agencies handling crimes involving CSAM, the tools they select should also be designed to reduce their exposure to traumatic content through automated evidence categorization.

As AI evolves, the private and public sectors must adapt and embrace responsible AI in their workflows. Before fully utilizing this new tech, organizations must consider their unique situations, select tools that fit their needs and ethical standards, and train their staff accordingly to improve efficiency and unlock better outcomes.

About the Author

Lt. Eric Kinsman

Commander, New Hampshire Internet Crimes Against Children Task Force

Lt. Eric Kinsman is Commander of the New Hampshire Internet Crimes Against Children Task Force.
