Battling Bureaucracy: Keeping Your Data Projects Moving

I'm finding that it's more and more important to talk about the human aspect of technology, especially at conferences that dive deep into the technical aspects of a discipline. 

This is the mentality I embraced when speaking at the SoCal Data Science Conference 2016, held at the University of Southern California on September 25, 2016. 

This talk was the 2.0 version of the one I delivered earlier in the year at Big Data Day LA. I shared 2 case studies of data-driven projects my team at the Mayor's Office is leading. Each story was anchored around a goal, a problem, how the problem manifested itself, and how we addressed it. 


The first story focused on my team's real estate initiative where we've been tasked with centralizing the City's real estate portfolio and configuring it into a cloud-based, accessible asset management system. 

The main problem we've faced is a lack of context around the City's real estate data. With highly decentralized systems, we end up with datasets that lack sufficient information to make holistic, enterprise-wide decisions. Prior to my team arriving in the Mayor's Office, the City interacted with its real estate data as rows and columns, cells in tables spread across numerous datasets.

This lack of context further politicizes real estate, an already political topic, creates confusion, and generates gaps in accountability. Combined, these issues manifested as bureaucrats slowing down contracts, forcing our team to create additional, often unnecessary, documentation to appease their concerns and to constantly provide verbal updates in meetings.


To solve for the lack of context, my team has worked diligently to share real estate information and data through maps. We've also brought in community-related data focused on neighborhood councils, economic development initiatives, zoning limitations, and more.

Additionally, we've not only put dots on maps but have also pulled in real estate asset polygons using APIs, and we've leveraged the City's GeoHub to import community layers about hospitals, schools, and more into the City's asset management system, so that policy makers and asset managers can interact with real estate properties within the context of the communities in which they exist.
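To give a sense of what working with such a community layer can look like: open-data portals like the City's GeoHub typically serve layers as GeoJSON feature collections. The sketch below is a minimal, hypothetical example of pulling asset names and coordinates out of one such layer; the field names and sample data are illustrative, not the City's actual schema.

```python
# Hypothetical sample of a GeoJSON FeatureCollection like those an open-data
# portal serves (e.g. a hospitals layer); field names are illustrative only.
sample_layer = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "properties": {"name": "Example Hospital", "category": "hospital"},
         "geometry": {"type": "Point", "coordinates": [-118.24, 34.05]}},
        {"type": "Feature",
         "properties": {"name": "Example School", "category": "school"},
         "geometry": {"type": "Point", "coordinates": [-118.28, 34.02]}},
    ],
}

def extract_assets(layer, category):
    """Return (name, longitude, latitude) tuples for features of one category."""
    results = []
    for feature in layer["features"]:
        props = feature["properties"]
        if props.get("category") == category:
            lon, lat = feature["geometry"]["coordinates"]
            results.append((props["name"], lon, lat))
    return results

print(extract_assets(sample_layer, "hospital"))
```

In practice the feature collection would be fetched from the portal's API rather than defined inline, and the extracted points or polygons would then be loaded into the asset management system as a map layer.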


The second story I shared focused on our employee safety and wellness initiative. Here the goal is to keep employees safer and to bring them back faster if they get injured. The problem is that there has been a lack of actionable information. Much of the data related to employee accidents and workers' compensation claims was being shared as Excel files covering one month of information, with a handful of metrics and a chart or graph.

This type of information exchange made it difficult to identify a starting point for solving problems, the time period being analyzed was too short, and the method by which the information was exchanged did not allow for conversation.


To solve this, my team created interactive dashboards using the business intelligence tool Tableau, brought together 12 months of data from across 2 main systems, and held discussions with the leadership of the top 8 cost-producing departments. These meetings were educational conversations where my team learned about the specific roles these departments play, the types of work their employees perform, how they would prefer to receive information, how they currently use data, how they aspire to use data, and how the Mayor's Office can support these visions.
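Bringing together data from two systems usually amounts to joining their extracts on shared keys and then ranking by cost. The sketch below, using pandas, is a minimal illustration under assumed column names (the departments, figures, and schemas are hypothetical, not the City's actual data).

```python
import pandas as pd

# Hypothetical monthly extracts from the two source systems;
# column names and values are illustrative only.
accidents = pd.DataFrame({
    "department": ["Sanitation", "Recreation"],
    "month": ["2016-01", "2016-01"],
    "accidents": [14, 6],
})
claims = pd.DataFrame({
    "department": ["Sanitation", "Recreation"],
    "month": ["2016-01", "2016-01"],
    "claim_cost": [120000.0, 45000.0],
})

# Join the two systems on department and month so one view can show
# accident counts alongside workers' compensation costs.
combined = accidents.merge(claims, on=["department", "month"], how="outer")

# Rank departments by total claim cost to surface the top cost-producers.
top = (combined.groupby("department")["claim_cost"]
       .sum()
       .sort_values(ascending=False))
print(top.head(8))
```

A dashboard tool like Tableau would sit on top of a combined table like this one, extended to 12 months of rows, letting department leadership filter and explore rather than read a static chart.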

As a result of the roadshow, the parties involved in keeping employees safe will be enhancing their reporting and leveraging technology and analytics, and industry-standard best-practice metrics have now been added to the Mayor's reviews of general managers.
