Ships hurled onto a highway by Hurricane Katrina in 2005. Image courtesy Robert Kaufmann/FEMA
Tens of thousands of people who applied for federal assistance after hurricanes Katrina and Rita received payments they weren’t entitled to, totaling $632 million, according to a new report from the Department of Homeland Security’s watchdog, the inspector general.
Disaster victims apply for assistance from the Federal Emergency Management Agency to cover everything from temporary housing and home repair to funeral expenses and moving costs until their lives return to normal. The federal government handed out over $7 billion from just one of these programs after Katrina and Rita. But the inspector general found that, according to FEMA’s own estimates, hundreds of millions of dollars’ worth of payments “were improper due to inadequate internal controls, human error, mistake and fraud.”
So what’s happening with the money if FEMA knows it needs to be recouped? A new process for doing that apparently hasn’t been approved. From a statement directed at FEMA:
Given the volume and significance of these improper disaster assistance payments and the [Obama] administration’s current effort to cut the billions of dollars wasted each year in improper payments, we recommend that you promptly authorize the collection of this debt.
The super-secret National Security Agency broke ground on a $1.2 billion “Spy Center” this month in Utah, where data collected across the nation’s intelligence community will be kept. A building company involved with the project was told not to discuss major details publicly. But the massive one million-square-foot project is expected to significantly buoy Utah’s struggling construction industry.
Almost 10,000 people will be needed over the next few years to help set up the spy center, currently the largest Defense Department project in the nation. Who’s jazzed about those jobs? Why, Utah politicians, of course, notably Republican Sen. Orrin Hatch:
Just as we defend our lands, America also needs to defend our cyberspace. The data center will be part of our expanding efforts to defend our Department of Defense computer systems from cyberattack and will also play a key role in helping [the Homeland Security Department] keep our government’s civilian computer systems safe.
Eighteenth-century philosopher Jeremy Bentham called it the “Panopticon,” a prison design in which inmates never knew for sure when or from where they were being observed. It’s possible to experience the same sensation today that Bentham intended for the incarcerated back then. Passing near a surveillance camera, you’re not entirely sure whether anyone is actively monitoring it, but the mere possibility is enough to adjust your behavior.
More than two centuries later, meet the digital evolution of Bentham’s Panopticon. The New York Times describes training exercises held at a penitentiary in West Virginia where surveillance cameras and accompanying software analyzed the movements of officers role-playing as prisoners. Image analysis technology has been with us for some time, but the story describes a remarkable leap forward:
When two groups of inmates moved toward each other, the experimental computer system sent an alert – a text message – to a corrections officer that warned of a potential incident and gave the location. The computers cannot do anything more than officers who constantly watch surveillance monitors under ideal conditions. But in practice, officers are often distracted. When shifts change, an observation that is worth passing along may be forgotten. But machines do not blink or forget. They are tireless assistants.
An expert on local police intelligence says suspicious activity reports, which we wrote about recently, are also nothing new. Patrol officers have long scribbled reminders to themselves about possible trouble they witnessed on the beat or drawn up field-interview cards with useful information. But the notes ended up disorganized in filing cabinets or on the floor of an officer’s locker, Stephen Serrao, a former New Jersey police captain and intel expert, explained in 9-1-1 magazine. A smarter strategy will be necessary for the future:
The key to the success of [suspicious activity reports] is to receive them, vet them, and conduct proper analysis – tapping the power of the large national dataset. For example, a SAR about a white van taking photos of the infrastructure of bridges needs to be checked against the national dataset to determine if there are other matches that might affect the threat assessment.
Federal authorities are already working on making suspicious activity reports generated by local police available to others nationally. But two issues remain: How will “suspicious” be defined, and how can analysts avoid being overwhelmed by unreliable information not connected to terrorism or crime? Serrao doesn’t have all the answers, and as an expert commentator on police intelligence, he doesn’t typically address the issue of privacy or civil liberties. But he does shed light on what could occur as a result of poor coordination:
Efficiency is going to become more important as the “See Something, Say Something” program expands, as it did recently to include new sectors of the economy like hotels. Over time, the increasing volume of calls and reports will demand efficient systems to manage the intake and processing of these reports. … SARs data is going to be incredibly valuable – but only if we manage it correctly. We can’t use a siloed approach because keeping data in standalone platforms is a step back toward the old paper field interview cards.