As CIO Miguel Blanco sees it, productivity measurement for big data teams takes two forms. One concerns the efficiency of the internal team itself; the other, more significant form concerns how much the data expands business capabilities. In an article for The Enterprisers Project, Blanco discusses which areas CIOs should be scrutinizing in big data initiatives.
The Big Targets
The first area to consider is business team data access velocity. The business needs to be able to swiftly review the data streams that are selected and compiled by the big data team, and CIOs must ensure no bottlenecks build up in this process.
Another area of focus is data modeling and preparation velocity:
How long does it take data science teams to correctly model and prepare the data for analysis? Does the data modeling process take days, weeks, or longer? How long is the competition believed to take? This is another crucial measure that directly influences data access velocity, and it must be balanced against the business teams' overall expectations. CIOs should take a close look at the data analysis tool set and the data stream origins to ensure the tools in use are adequate and not hindering the data modeling stage.
Lastly, consider business big data output absorption. In other words, how quickly is the business able to consume the insights yielded from data models, and how quickly does it subsequently take action? Additionally, how useful is the data in the first place? Not every insight will produce earth-shattering change, and that is fine, but it is still important to keep a pulse on the impact that big data is producing.
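To make velocity metrics like these concrete, here is a minimal sketch of how a team might track turnaround times from compiled data to delivered insight to business action. The record fields, stream names, and dates are hypothetical illustrations, not from the article.

```python
from datetime import datetime

# Hypothetical pipeline log, one record per data stream.
# Field names (compiled_at, delivered_at, acted_on_at) are assumptions
# made for illustration, not metrics defined in the article.
records = [
    {"stream": "sales",   "compiled_at": "2016-11-01",
     "delivered_at": "2016-11-03", "acted_on_at": "2016-11-10"},
    {"stream": "support", "compiled_at": "2016-11-02",
     "delivered_at": "2016-11-09", "acted_on_at": "2016-11-20"},
]

def days_between(start: str, end: str) -> int:
    """Elapsed whole days between two ISO-formatted dates."""
    return (datetime.fromisoformat(end) - datetime.fromisoformat(start)).days

# Modeling/preparation velocity: compiled data -> delivered insight.
modeling_days = [days_between(r["compiled_at"], r["delivered_at"]) for r in records]
# Output absorption: delivered insight -> business action taken.
absorption_days = [days_between(r["delivered_at"], r["acted_on_at"]) for r in records]

avg_modeling = sum(modeling_days) / len(modeling_days)
avg_absorption = sum(absorption_days) / len(absorption_days)
print(f"avg modeling days: {avg_modeling}, avg absorption days: {avg_absorption}")
```

Trending these averages over time (rather than reading any single value) is what lets a CIO spot the bottlenecks the article warns about.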
You can view the original article here: https://enterprisersproject.com/article/2016/11/get-good-handle-these-metrics-ensure-success-big-data-endeavors