How IRS' New Tool to Thwart Tax Cheats Also Helps FEMA Respond Faster
Oct. 24, 2012, 10:37 AM
Internal Revenue Service Commissioner Douglas Shulman (center) testifies before the Senate Homeland Security and Governmental Affairs Committee
For the IRS, analytics means translating real-time data into early warning signals that help the agency spot tax abuses more quickly, saving the Treasury Department millions of dollars in potential bad credit claims.
For many executives, though, what analytics actually means, how agencies are using it, and what distinguishes it from traditional workplace performance management measures remains a murky subject.
That's in part because analytics represents a relatively new and evolving approach to diagnosing hard-to-see and hard-to-measure factors that can affect how successfully teams perform their missions.
It's also a discipline that tends to require right-brain, left-brain leaders who can forge analytics teams that possess the technical skills needed to analyze big data sets but also understand the subtler nature of human and social behavior.
And, as a new report concludes, there are also many ways to apply analytics. The report, entitled "From Data to Decisions II: Building an Analytics Culture," was released this week by the Partnership for Public Service and the IBM Center for the Business of Government. The report highlights how a half dozen federal agencies are using data analysis to deliver better outcomes for their stakeholders and the lessons they learned.
The IRS's Office of Compliance Analytics is an example of both the emergence of analytics in the workplace and how data analysis is helping agencies take a more systematic approach to improving performance.
Dean Silverman, senior advisor to the commissioner, heads up analytics initiatives at the IRS. He shared his experiences at a forum accompanying the report's release (pictured second from right above). One of his team's pilot projects was to help IRS employees identify tax preparers early in the 2012 tax-filing season who were making repeated errors. The goal was to intervene, work with the preparers and reduce subsequent errors.
To accomplish that, the agency batched the returns filed by a couple hundred tax preparers, pulling real-time information daily, and within two weeks identified the symptoms of faulty filings. The pilot project eventually led to an estimated savings of more than $100 million in bad credit claims that did not have to be processed.
"What's different today is the power to analyze activities in real time," said Silverman, speaking at the conclusion of a panel discussion. "That's a crucial difference because "people are continually trying new tax-cheating schemes" and because traditional remedies can become outdated quickly. "It's not always clear when a new scheme is emerging. The trick is to identify them with only a couple of data points," he said.
Using analytic tools, Silverman's office "tries to anticipate the next incarnation at the same time it diagnoses and fixes existing problems."
Carlos Davila, director of FEMA Recovery Directorates' Business Management Division, is another believer in the power of analytics. Davila is helping various teams get a clearer picture of how well recovery activities and services -- such as housing, financial assistance, legal services, cleanup funds and rebuilding efforts -- are achieving the desired results.
"Most people are accustomed to focusing on inputs and outputs. But we also needed to measure how this affecting our employees," he said.
"You need to identify the questions you want to answer, what data you will collect to answer them, and how you will organize information to make it useful and standardize how you present the analysis," Davila said.
Turning data analytics inward on employee performance runs the risk of raising fears, however, that analytics is just another version of performance management measurements that could lead to unwanted scrutiny.
"The countermeasure for being afraid is trust," said Daniel Liddell, federal security directory for a region of the Transportation Security Administration. Liddell stressed that one of the lessons he's learned in applying analytics at TSA is the importance of using data analysis as a tool to enhance overall team performance.
Indeed, that was one of several overarching lessons captured by the report, which was based on focus groups and interviews with nearly three dozen federal managers and analysts representing the Departments of Defense, Health and Human Services, Homeland Security, Interior, Treasury and other agencies.
Judy English-Joseph, strategic advisor for the Partnership for Public Service, summed up those lessons, saying, "Federal agencies face major fiscal decisions and must be able to justify what they do, why they do it and what the nation gains from it."
Their research identified four common practices among agencies seeing value in applied analytics:
- Their leaders focused on transparency, accountability and results
- Their staff was given a clear line of sight to desired goals or outcomes
- Their agencies invested in technology, tools and talent to support analysis
- Their agencies' cultures cultivated and leveraged partnerships within and outside their agency
The report also assembled these recommendations to help agencies get started utilizing analytics:
Start with a systematic and disciplined approach. There's no single method or approach. The key is to gain a solid understanding of the agency's programs, goals and objectives. Use questions as a starting point, even if they can't be answered with current data. Take large issues and break them into smaller, workable components.
Make analytics the way you do business. Leaders at all levels need to live by example. Recognize and celebrate success in using analytics. Communicate how analytics can help with understanding ways to better achieve program outcomes and impact.
Get the people piece right. Fundamental change must be led. Focus on bringing together the right people who are all in it together. Tap a mix of people with different backgrounds and strengths.
Silverman noted, "There is a critical distinction between analytical skills and problem solving skills. Problem solving starts with curiosity or the scientific method. The only objective proof that analytics works is they have to solve a problem."
A PDF copy of the full report is available for download here.
This post originally ran on AOL Gov.