ScalarOps Workforce Architects
Case Studies · January 5, 2025 · 5 min read

Automating Document Processing: Lessons from Logistics

A logistics company in the Netherlands came to us last year with a familiar problem. Their operations team was spending most of their day copying information from one system to another. Bills of lading would arrive by email. Someone would open the PDF, find the relevant fields, and type them into the ERP. Then they'd do it again. And again. Forty times a day, per person.

The actual problem

The company moved freight across Europe. Each shipment generated paperwork: bills of lading, customs declarations, carrier confirmations, delivery receipts. The documents arrived in different formats from different sources. Some were structured PDFs. Some were scanned images. Some were email bodies with the details buried in prose.

The operations team had six people. Four of them spent roughly half their day on data entry. They'd extract shipment numbers, weights, origins, destinations, and dates, then enter everything into their transport management system.

The work wasn't hard. It was just tedious and endless. The team's experienced coordinators, people who knew which carriers were reliable and which routes had problems, were spending their expertise on copy-paste tasks.

They'd tried OCR software before. It worked on clean PDFs but choked on scanned documents and couldn't handle the variability in formats. They'd also tried hiring temps, but training took weeks and error rates were high.

What we built

We deployed a local LLM trained to extract structured data from logistics documents. The model ran on a server in their office. Documents never left their network.

The system worked like this: documents arrived by email. An automated process sorted them by type. The LLM extracted the relevant fields and formatted them for the ERP import. A human reviewed the extractions before they went into the system.
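That flow can be sketched in a few lines. This is an illustration under assumptions, not the deployed system: the field names, the keyword-based classifier, and the key-value parser standing in for the local LLM call are all hypothetical stand-ins.

```python
# Sketch of the document pipeline: classify -> extract -> flag for human review.
# The real system used a local LLM for extraction; parse_fields() is a stand-in.

REQUIRED = ("shipment_number", "weight_kg", "origin", "destination", "ship_date")

def classify(subject: str) -> str:
    """Sort an incoming document by type from its email subject (hypothetical keyword rules)."""
    s = subject.lower()
    if "customs" in s:
        return "customs_declaration"
    if "confirm" in s:
        return "carrier_confirmation"
    if "delivery" in s or "receipt" in s:
        return "delivery_receipt"
    return "bill_of_lading"

def parse_fields(doc_text: str) -> dict:
    """Stand-in for the LLM extraction step: read 'Field: value' lines into a dict."""
    fields = {}
    for line in doc_text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip().lower().replace(" ", "_")] = value.strip()
    return fields

def needs_review(fields: dict) -> list:
    """Return any missing required fields; a non-empty result routes the document to a reviewer."""
    return [f for f in REQUIRED if not fields.get(f)]
```

In practice every extraction went past a reviewer; the `needs_review` check only decides which ones get flagged as incomplete before the ERP import.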

The human review was intentional. We didn't try to eliminate people from the process; we tried to change what they spent their time on. Instead of typing data, they verified it. Instead of keying in forty documents a day, they reviewed forty extractions. The work was faster and caught more errors.

The model wasn't perfect. It struggled with handwritten annotations and occasionally misread faded scans. But it was right about 85% of the time on the first pass. The review step caught the rest.

The results after six months

Data entry time dropped by 40%. The four people who had been doing half-day entry shifts were now doing two-hour review shifts. The rest of their time went back to actual logistics work: coordinating with carriers, solving delivery problems, improving routes.

Error rates went down, not up. The review step caught mistakes the old manual process had missed. Shipments stopped getting lost because someone fat-fingered a postal code.

The team's morale improved. This sounds soft, but it mattered. The experienced coordinators felt like they were doing their actual jobs again. One of them told us it was the first time in years she'd had time to call carriers proactively instead of just reacting to problems.

The system paid for itself in four months. Hardware, setup, and our fees totaled about what they'd been spending on temporary staff each quarter. After that, the savings were pure margin.

This wasn't a moonshot project. We didn't transform their business or replace their team. We just took a specific, annoying task and made it less annoying. The lesson is that useful AI often looks like this: small, targeted, and designed to work with humans rather than replace them.

Ready to automate your operations?

Let's identify where AI can have the biggest impact on your business.