Outsourcing Some Analytics Functions Could Keep You Ahead Of The Stampede For Talent

In March 2017, Dun & Bradstreet and Forbes Insights explored the current state of analytics adoption across the enterprise via a survey of more than 300 executives in North America, the U.K. and Ireland across a range of industries. A recent report, “Analytics Accelerates Into the Mainstream,” sponsored by Dun & Bradstreet, analyzes the survey results.

Some key takeaways:

Finish analysis, get to action. It is no longer enough to champion sophisticated data and analytics strategies to the C-suite. Analytics is no longer a “nice to have.” The survey shows that analytics has very high, cross-functional adoption across best-in-class enterprises. Senior executives increasingly understand the need to invest in the people, processes and technologies that empower insights-based decision making and decision automation, whether to keep pace with their peers or to leapfrog the competition.

Analytics expertise required. Of those surveyed across business functions, 27% cited skills gaps as a major obstacle to their current data and analytics efforts. More than half of the organizations surveyed use outside partners for some or all of their analytics needs. In fact, 55% of those companies reported working with third-party partners to address the lack of skills. This strongly indicates a need for skilled data analysts across the enterprise spectrum. Sixty percent of respondents who use outside partners stated that internal staff did not have bandwidth for the analytics needs of their companies. Bringing in outside partners with analytics as a core competency enables organizations to scale up and scale down while adding critical capabilities.

Demand for data insights growing across all major industries and disciplines. The demand for data insights appears to be almost evenly distributed across a variety of industries. Manufacturing leads the pack with the most demand but is closely followed by technology, retail, financial services and insurance.
New use cases arise. An intriguing and enlightening perspective was revealed when the survey explored respondents’ value perceptions in advanced analytics use cases. Emerging analytics categories like geolocation, cyber security and the Internet of Things (IoT) all scored well, and 33% expressed interest in responsible business analytics covering key areas, including human trafficking and sustainability.

The impact of digitization on the enterprise has only just begun. IoT. Digital transformation. The consumerization of B2B. Pick any of the disruptive initiatives and what happens? Enormous amounts of data are generated. And unfortunately, while enterprises have spent billions to address infrastructure, workflow and transactional frameworks, many have not scaled their analytics solutions to derive insights and actionable intelligence from the growing onslaught of data.
Sophisticated analytics adoption seems to be taking some time, as 40% of companies surveyed are still using basic technology like dashboards and spreadsheets for analysis and reporting. This suggests that much of the analytics work being done today remains fairly basic.

Anticipatory analytics techniques have been implemented by only 15% of respondents, while predictive models that integrate internal and external data have been implemented by only 17%. This points to real competitive upside for companies that move first and start basing future business decisions on more sophisticated data sets and analyses.

Sophisticated analysis requires robust applications and high-performance data environments, and many companies have yet to make those investments. Exacerbating the situation is the difficulty of finding analytical talent with sufficient skills and experience to create the models and queries that make the most of advanced analytical applications.

Surprisingly, in a business environment where big data and analytics are heralded as the new gold rush, only 45% of companies surveyed handle all of their analytics in-house—39% use a combination of internal and external resources, and 16% outsource the entire function. Delving into the primary drivers of the outsourcing decision revealed a desire for access to proven talent and best practices. In fact, 55% of the companies that outsourced felt their partners were delivering a superior work product, and 52% felt that outside providers helped them compete more effectively.

Microsoft Workplace Analytics helps managers understand worker productivity

Microsoft has long held that the Microsoft Graph — the data it collects as a byproduct of people simply using their online tools — provides a wealth of information that companies can use to understand their workers better. Today, the company announced general availability of Workplace Analytics, which has been designed to give managers and executives a broad understanding of employee productivity across departments based on Microsoft Graph data.

Last year, Microsoft introduced MyAnalytics to give employees a view of their individual productivity based on their calendar, email, documents, Skype and other information gleaned from their Office 365 usage data. At the time, TechCrunch’s Frederic Lardinois wrote that while employees could see that data, their managers could not.

Today, that changes, but Alym Rayani, director for Office 365, insists this is about using data to understand which behaviors create the most productive employees. “Even when you have quantifiable outcomes, inputs and behaviors are not always well understood. It’s important to know what ‘good’ is. It’s even harder to [identify] the underlying behaviors that will drive that employee satisfaction,” he said.

Rayani says research has shown that increasing employee productivity and engagement is going to be a big priority for organizations moving forward. That requires insight and data, precisely what he says Workplace Analytics has been designed to provide.

For the initial launch, the product looks at email and calendar data, since this should provide a great deal of insight into how individual workers spend their days, but over time Microsoft could explore other types of data being generated in the Microsoft Graph.

Microsoft is providing an overview dashboard inside Workplace Analytics and four standard views of organizational productivity, including Week in the life, which looks at how the entire company spends time and collaborates; Meetings, which looks at the quantity and quality of time spent in meetings; Management and coaching, which measures how much one-on-one time employees are spending with their managers; and Networks and collaboration, which looks at how employees are connecting across the company.
You may be thinking that if it can identify positive behaviors and productive employees, it could just as easily be used to identify employees who are less productive, but Rayani says that throughout the private beta, not one company used it to call out employees.

Instead he said it was about looking at output versus behaviors and finding ways to improve the outcomes. For example, managers could look at the activities of top performers and learn how those people spent their day, then use that data to teach other employees to use those techniques to improve productivity.

The product could require some consulting help from Microsoft to get going, especially if customers want to create custom dashboards beyond the ones available out-of-the-box. Customers could also potentially combine Workplace Analytics data with other data they are collecting across the organization to give management a more complete picture, even if all that data isn’t coming from Microsoft products. That kind of customization would also likely require consulting help.

As podcast analytics get better, who wins and who loses?

Having more specificity about podcast numbers will attract more advertisers and benefit the creators of shows that have passionate audiences, Quah explained.

“That is basically a black box,” Quah said of the “downloads” number. “That scares away a lot of advertisers that are accustomed to a higher degree of measurement and more specificity into whether your ads are being served.”

“It’s the equivalent of ‘I sent out a magazine today.’ You don’t even know if people looked at page four, which is where your spread is. It may just be in the back bin somewhere. That is an issue for podcasting if it wants to be classified as a digital product and to be accepted in the larger pool of buys.”

But the losers as Apple professionalizes its analytics are the creators without an engaged audience who have been selling ads based on that “downloads” number alone.

“There’s going to be a lot of podcasts realizing that they didn’t have audiences as big as they [thought],” Quah said. “And there’s going to be a smaller percentage of podcasts that realize they have a very, very engaged audience base and it’s as big, if not slightly less, than what they thought it was. So we’re really going to see a shakeout, resizing and reconceptualization of, how do you measure for the success of a show?”


How Big Data Analytics May Help Governments Protect Us

The primary role of government is to protect the safety and well-being of its citizens.

Statistics reveal that the amount of information captured by the federal government has grown exponentially since 2000, and since 2013 the volume of big data captured has been expected to double each year.

Analysis of such data is nothing new. As far back as 1967, the then-Office of Education and Office of Economic Development launched a program that spent more than $600 million in an attempt to correlate low household income with academic achievement. And that was just the beginning.

The information that can now be obtained from government databases is enormous; but with government budgets under continual strain, Sean Brophy of Tableau Software explains that the pressure is on for “agencies to find and use higher-value, more flexible tools, especially ones that help them see their data faster and clearer.”

Not surprisingly, that has encouraged a new generation of big data enterprises: from BI suites from the likes of Jaspersoft and Pentaho to dedicated, flexible big data analysis databases such as those offered by players like SQream Technologies, technological solutions abound to help draw rapid insights from large volumes of data.

Current Projects

But how exactly can governments turn this data analysis into concrete federal programming, designed to protect the lives of citizens? Here are a few areas which can benefit enormously from accurate analysis of big data.

1. Transportation

The management of transportation suffers from any number of variable factors, including weather conditions and driver ability, but big data analysis can ensure that both federal and local governments are able to plan ahead. For instance, based on analysis of traffic movements, greater numbers of traffic officers can be allocated to specific locations, and better traffic calming measures can be installed at accident hotspots.

SPATIOWL, by Fujitsu, is one of the most developed services dealing with sensor data from public transportation, vehicles and pedestrians’ smartphones in urban areas. It analyzes across various data layers and provides insights that identify relevant actions in the user’s context.

2. Emergency Preparedness

Every population faces risks from one sort of natural disaster or another, from earthquakes to tornadoes to unexpectedly large snowfalls. Big data analysis is already providing new abilities to improve disaster prediction and management.

In recent years, huge amounts of data have been stored across nearly every scientific discipline. Whether we’re talking about high-resolution satellite imagery or seismic data, high-speed data analytics makes it possible to recognize patterns that may help anticipate natural disasters and expose vulnerabilities in a city’s infrastructure.

But, even with better predictions, disaster management will remain critical. During and following an event, real-time analysis of vast networks such as wireless sensors and cameras, transportation grids, and even social media can alleviate suffering by improving the delivery of emergency services.

3. Climate Research

According to its website, the Atmospheric Radiation Measurement (ARM) program “has access to over 20 years of atmospheric data gathered during normal operations and field campaigns” and will “explore more than 350 instruments that collect data at locales spanning diverse climate regimes.”

Although unstated, the goals of ARM are to use the data gathered to further human understanding of our climate, and ideally ensure that we are better placed to deal with the increased threats from global warming and other climate change issues.

Conclusion

Overall, despite the sometimes confusing technological language involved and the sheer volume of data in question, the benefits of big data to the government’s mission to protect its citizens are clear.

With that in mind, in 2012, the Obama administration announced a $200 million investment in big data research-and-development programs, which spanned agencies including defense, intelligence, and cybersecurity.

As the results of these programs roll in, there will no doubt be an increased demand for such analysis as time goes on.

How to Understand User Behavior With Google Analytics

Want more conversions? Want to make smarter and better decisions about your SEO and other marketing activities?

Then you need to analyze what people do on your website.

Google Analytics is the single best tool to help you understand user behavior.

Here are three ways you can use Google Analytics to understand exactly what people do when they land on your website.

Use In-Page Analytics to Analyze Website User Behavior

In-Page Analytics is a neat Google Chrome extension that is useful for a couple of reasons:

Quick Access

In-Page Analytics lets you instantly connect to your Google Analytics account without having to log in to the traditional interface (assuming you’re already logged into Google Chrome). This makes the process so much faster if you want to do a quick check to see how a particular page on your site is performing.

Like the traditional interface, you can select a date range for the data you’d like to see and instantly access page views, average time on page, and bounce rate all while you’re surfing your website.

If you really want to get fancy, you can also view your data by specific segments: all users, new users, organic traffic, paid traffic, and mobile traffic, just to name a few. You can even see how many people are on your site at that exact moment.

Page Analytics Google Chrome extension

Heat Map

What I love most about the extension is the heat map functionality. You can see exactly what percentage of website visitors click on any clickable area of your site. This is so useful because you can see if people are actually clicking on all of the buttons, links, etc. on any given page.

Let’s say you have a button with a call to action that says “read more”. You’ll be able to see the percentage of people who actually click on that button.

You may find out that your “obvious” call to action may not actually be so obvious. Based on the results from the heat map, you can make tweaks and improve the overall user experience of your site.

Check out this screenshot of the footer of my site:

(Screenshot: heat-map click percentages for the site footer)

My most-clicked links are to my services, portfolio, and blog. So much for the theory that your About Us page is the most popular page on your site.

This is why understanding your analytics and the data specific to your site is critical to understanding how your audience and your website users behave. Knowing general trends can be useful, but nothing is better than knowing and understanding your own stats.

Set Up Conversion Goals to Track Website User Behavior

A goal allows you to measure how often a user takes a specific action on your site.

Google Analytics goals let you track the conversion rate of specific actions you want your website users to take—for example, signing up for a newsletter, claiming a free gift, making a purchase, or watching a video.

Why is this important?

It allows you to translate your business objectives into measurable actions within your website. Instead of just knowing how many people land on a page, you’ll know what specific action they’re taking (or not taking) on that page.

This information will give you valuable insight into what calls to action you may need to tweak or possibly remove from any given page.

To set up a goal, go to Admin > Goals > New Goal > Select template or Custom (I prefer Custom).

Next, give your goal a name and select the type of goal:

Destination: The goal is complete when a user lands on a particular page.
Duration: The goal is based on how long it takes a user to complete an action.
Pages/Screens per session: The goal is complete when a user browses a certain number of pages.
Event: The goal is complete when a user triggers an event (such as watching a video).
Lastly, you’ll enter the specific details based on the type of goal you selected and how much detail you would like to see.

For example, you can assign a monetary value to a goal to view the approximate amount of revenue you’ll generate for each completed goal.
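If part of the action you want to track happens outside your page tags (say, a server-side signup or video-completion handler), you can still feed an Event goal by sending the hit yourself. Here is a minimal sketch using the Universal Analytics Measurement Protocol; the tracking ID, category, action and value below are placeholders, not values from this article.

```python
import uuid
import requests

# Minimal sketch: send an event hit to Universal Analytics via the
# Measurement Protocol so it can be matched by an Event goal.
# GA_TRACKING_ID is a placeholder; use your own property's ID.
GA_TRACKING_ID = "UA-XXXXXXX-1"

def send_event(category, action, label=None, value=None, client_id=None):
    payload = {
        "v": "1",                               # protocol version
        "tid": GA_TRACKING_ID,                  # tracking ID
        "cid": client_id or str(uuid.uuid4()),  # anonymous client ID
        "t": "event",                           # hit type
        "ec": category,                         # event category
        "ea": action,                           # event action
    }
    if label is not None:
        payload["el"] = label                   # event label
    if value is not None:
        payload["ev"] = int(value)              # event value (integer)
    resp = requests.post("https://www.google-analytics.com/collect", data=payload)
    resp.raise_for_status()

# Example: record a completed video view with an (assumed) goal value of 5.
send_event("video", "watched", label="product-demo", value=5)
```

An Event goal configured with a matching category and action would then count each of these hits as a conversion.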

Setting up goals in Google Analytics

My favorite type of goal for tracking website user behavior is a destination goal. It’s simple to configure and it can track visitor actions such as signing up for a free lead magnet.

So, as an example, let’s say you have a sign-up form for a free gift located on several pages within your site. A Google Analytics goal will measure the percentage of users who actually sign up for your free gift on each page that has your sign-up form. This lets you see not only how many people are signing up for your free gift but, more importantly, how many are signing up on each page.

Google Analytics puts your data into context so you can easily see which pages are converting the best. Knowing this, you can send traffic directly to the high-converting pages and split test to improve the performance of the lower-converting pages.

People often assume their largest traffic sources correspond to their largest number of conversions, but that’s not necessarily the case. By only looking at your traffic reports and not setting up goals, you can miss this piece of the puzzle.

To put this into perspective, let’s say your traffic reports tell you that the majority of your traffic is coming from Pinterest. But when you view your goals, you may see that the majority of your conversions are actually coming from search engine traffic. So even though you may have less search engine traffic than Pinterest traffic, your goals show that search engine visitors are more inclined to opt in to your list. Tracking your goals lets you spot this easily, so you can adjust your marketing strategies accordingly.
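To make that concrete, here is a small sketch with made-up numbers showing how the source that sends the most sessions can still convert worse than a smaller one; in practice you would pull these figures from your Source/Medium report alongside goal completions.

```python
# Illustrative only: hypothetical session and goal-completion counts per source.
traffic = {
    # source: (sessions, goal completions)
    "pinterest.com": (12000, 180),
    "google / organic": (7000, 350),
    "direct": (3000, 60),
}

# Sort by session volume and print each source's conversion rate.
for source, (sessions, conversions) in sorted(
        traffic.items(), key=lambda kv: kv[1][0], reverse=True):
    rate = 100.0 * conversions / sessions
    print(f"{source:18s} sessions={sessions:6d} conversions={conversions:4d} rate={rate:4.1f}%")

# Pinterest drives the most sessions in this made-up example, but organic
# search converts at a far higher rate -- the pattern described above.
```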

View Behavior Reports to Understand Website User Behavior

Behavior reports are the best tools to use for diving deep into website user behavior.

While it’s great to know the type of people who visit your site (Audience Reports) and how people find their way to your site (Acquisition Reports), the best place to understand what actions people take once they’re on your website is the suite of Behavior Reports.

Google Analytics Behavior Reports

Your Behavior Reports can answer these questions and much more:

Are people actually following the user path you intended?
What page or blog post are people using the most to sign up for your newsletter?
Which pages are people staying on for the longest and shortest amounts of time?
What are people searching for in the search bar on your website?
What events (such as watching a video) are people completing (or not completing) on your site?
Which pages are converting the best?
Which calls to action are not converting and should possibly be removed from the site?
Summary

Knowing how people behave when they’re on your site will help you understand your conversions and how you should structure your marketing campaigns to increase your ROI and overall success.

Analytics and the NFL: Finding Strength in Numbers

Paraag Marathe arrived in San Francisco with a very clear directive from the 49ers: reimagine the Jimmy Johnson draft chart. And after a couple of months’ work, the senior associate from the Boston-based consulting firm Bain readied himself to show team president Bill Walsh and GM Terry Donahue his findings, in disbelief at the results being spit back at him.
It was uncanny.
“I tried to use historical trends and true value,” says Marathe, now the Niners’ chief strategy officer and EVP of football operations​. “And it wasn’t like Coach Walsh was telling me, ‘Hey, a third rounder has more value than it says here, and a second rounder has lower value.’ It was meant to be totally independent. And once I was finished, I looked at all of Coach Walsh’s trades over the years. And it was a total match with all of Coach Walsh’s trades.”
All that work, and all that was proven was what some guy on his couch down the street in Santa Clara could’ve told them: Walsh was a savant.
That was 2001. Moneyball—the best-selling book based on the baseball team across the Bay from the Niners, which would shed light on the coming numbers craze—was still two years from release. The idea of using advanced statistics to drive decision-making in baseball was still in a nascent stage, at least publicly. And that, of course, implies the truth about the NFL then, which is that few in football had even given that concept a thought.
It’s a different time now. Analytics in the NFL have moved well beyond the point where a team hiring a consulting firm to run numbers constitutes outside-the-box thinking. Yet, there remains resistance, a battleground of thought, and a general cloudiness on how far you can take numbers, and how far numbers can take you.
And just the same as 16 years ago, you can place a genius in the center of it.
Perception holds that the Patriots are among the league’s most progressive teams, but there’s precious little evidence of their investment. They have people who are responsible for advanced statistics, but coaches and scouts are largely charged with integrating data gathered into their work. The “analytics guy” there is 64-year-old Ernie Adams, a former Wall Street trader and prep school buddy of Bill Belichick’s.
So why are people so convinced that the five-time champions are knee-deep in the numbers? As one informed long-time NFL exec explains it, “It’s because they’re completely consistent with what sophisticated analytics would tell you to do.”
“[Belichick] does it with intuition,” says one AFC executive. “You know because you’ve been coaching for so long, how you match these 11 guys against those 11 guys. It all makes sense to you. At some point, maybe we can all come to those conclusions without having Bill Belichick’s brain. We’re still a long way from that.”

The NFL is getting closer. The MMQB spent a month discussing analytics with more than 40 team officials from across the league—coaches, executives, scouts and analytics people—and there are some hard conclusions that can be made on where the league is.
1. Most teams don’t shy away from analytics. In fact, more than one official was offended by the notion that their team would be called “old school”. When it comes to player acquisition (which is what Moneyball was based on), the average NFL team is using the data. It’s just that it is being used to generate boundaries rather than drive decisions. Teams want to know when they’re making exceptions on one player, and they want to know what they might be missing on another they may have otherwise dismissed.
2. On the coaching side, analytics are generally used to make staffs more efficient. There may only be time for quality-control coaches to break down four or five of an opponent’s games in the week they have leading into a particular game. And that, in the past, would lead to guesswork on tendencies and strengths and weaknesses. The data allows the quality-control guys, and staffs, to crosscheck against larger sample sizes.
3. The limits in those two areas are the number of games (16) and the variance in players’ assignments and situations that affect plays. That makes it more difficult to collect the amount of data necessary to make hard decisions.
4. Conversely, the value of the data in those areas is proven in that nearly the entire NFL has subscribed to Stats LLC and/or Pro Football Focus, and some rely on smaller services, like Pro Scout Inc., which is run by 64-year-old former Utah coach Mike Giddings.
5. There aren’t sure marks of analytics-friendly operations on game days (as there are in basketball, with teams that go for the “two-for-one” possession at the end of the quarter or half). But on the personnel side, you can see it in asset management, with teams that trade down in the draft to pad their margin for error and use cap space creatively.
6. There is one strong consensus league-wide: Analytics data related to injury prevention, which straddles sports science and comes through player tracking, is useful and will only get better. The NFL is just scratching the surface with this technology, and the floodgates will truly open only when the league makes available all the Zebra data it’s been collecting. Another step here that’s expected to come eventually is in profiling the minds of players.
7. The Walsh/Belichick robot is not on the market. Yet.
• SMARTER FOOTBALL WEEK: A series examining the cerebral side of the sport, including technology, analytics, how a brainy linebacker prepares and just what goes into a typical NFL play.
* * *
The cliché is still going strong. On one side you have the gym teacher with the whistle around his neck, on the other it’s the dweeb with the taped glasses, pocket protector, and stacks of hard-to-decipher numbers and data. And, if you believe the cliché, it continues that football’s tough-guy culture has made it slower to change than other sports.
The truth is more complicated. Part of the problem with finding analytics’ place in football is the term itself. Much of the perceived hesitancy to embrace quantitative analysis in the NFL comes down to the fact that what is now referred to as “analytics” was already a huge part of the game. It just didn’t have that name.
The game itself is rooted in tactics and strategy and details, and so the study of those has always been inherent in the coaching of the sport and building of its teams.
“I think most of the stuff we’ve done for a long time,” says one NFC general manager. “[Analytics] is the buzzword. That’s how everyone clumps it together now. Like, until five years ago, I’d never heard combine data called ‘analytics.’ But now, someone smart can make it look pretty—analytics. To me, analytics is the Pro Football Focus numbers, finding a way to put a numeric grade on every play and quantify it.
“No one has enough people on staff to do all of that. A lot of it is just statistics, and then what the smart computer guys can make it tell you.”
Bill Walsh was ahead of his time on analytics, even if he didn’t know it.

There’s a lot of gray to cut through. Quality control coaches have forever done what are now considered game-week analytics—identifying opponents’ tendencies, finding trends, and setting up their bosses to game plan. On the personnel side, scouting assistants have, likewise, undertaken analytical studies for decades to uncover advantages and inefficiencies in the draft and free agency.
“Up until 15 years ago, football was probably the furthest along of any of the sports as far as studying the game itself,” says an analytics manager for one AFC team. “It just wasn’t guys from fancy schools doing regression. But there was real systematic study of the game. And frankly, basketball and baseball only started doing the kind of film study that football has done forever just recently.”
So here’s where you start: The reason there isn’t an NFL team ignoring analytics is because analytics has been done in football since Paul Brown came along.
Some teams have very little in the way of analytics staffing and, based on their business practices, are considered advanced. Others have entire departments in place and those who study analytics wonder, based on the way those teams operate, if all their number-crunching has influenced even a single decision.
How is that possible? Well, as Jaguars SVP of football technology and analytics Tony Khan—the son of owner Shahid Khan—explains it: “The adoption rate is far behind other sports.” More than three-quarters of NFL teams either employ a director of analytics or have a full-blown analytics department. Others have their cap managers oversee it.
The divide in the NFL comes in how the data collected is put to work.
“There are a lot of skeptics,” Khan says. “And that’s honestly probably on the analysts and the statisticians. You have to be able to explain it to the football people in their terms. That’s why I try to study that, to be able to communicate with coaches and scouts on a meaningful level, instead of forcing them to use our terminology.”
It’s easy to figure out where and why the divide happens. Football lacks the repetitive one-on-one situations that make up a large portion of baseball and basketball, and it’s tougher to figure out what each player’s assignment is without hearing it from a coach. Those assignments are more divergent, too—in basketball, a center’s role on defense can be affected by the opposing point guard’s movements and decisions and the spacing of all the opposing players; in football, a right tackle’s objective doesn’t relate much to what a free safety is deployed to do.
That makes putting values on players and making apples-to-apples judgments difficult, and the idea of filtering game-day decisions through a strict set of rules problematic. It also makes it challenging for services like PFF and Stats LLC to find the right way to assess players and plays on their own, and to generate value for coaches and scouts.
Then, there’s the issue of sample size.
“In baseball, you have 162 games, and the same exact play starts every play, and that’s the beauty of it,” says one AFC general manager. “You start to appreciate the sheer volume of the numbers, and how they compare over time. In basketball, there’s a natural progression—up and down, up and down. The event of the team going up the court happens so many times, you can chart shots, rebounds.
“And so much of it in both sports is pass/fail. You break that down with the amount of times the events repeat themselves, you can build big data.”
In football, on the other hand, you get anecdotes like this one: There is one team that has a club exec who heads up data collection. He’ll call his head coach into his office and hand the data to him. But the report never makes it to the coach’s desk, only out to the hallway where, between the two offices, there is a trashcan.
* * *
Some teams have analytics directors. Others have departments. Some integrate data and make football people responsible for working it into their jobs. Others keep the sides separate, with an over-the-top manager (usually the GM) sorting it out. Some build their own systems. Others have the numbers people there only as window dressing.
One commonality? Most subscribe to services. Cincinnati-based Pro Football Focus now lists 27 teams as clients, and founder Neil Hornsby says, “I’d be surprised if we haven’t got 32 of 32 by next season, if not this season.” Similarly, Stats LLC has 26 teams as subscribers. The reason? It’s a way to become more efficient and guard against missing anything, even if you’re still not sure how to use all the numbers.
“I come from a consumer product and enterprise software background,” says John Pollard, a former exec at Stats LLC who now works as a consultant. “And what struck me in the early 2000s is when more information and research and analytics became available in over-the-counter pharmaceuticals and retail, it was just too much information. And we’re going through the same thing in sports now.
“It’s being driven by the information. Technology provides the efficiency. And analytics provide more effectiveness in the decision-making. The goal isn’t to replace people. It’s to make them more effective and efficient.”
Tony Khan heads up the analytics operation for his father’s team in Jacksonville.

On the personnel side, there are clear examples of where the numbers are leading to change. In 2009, Dolphins linebacker Joey Porter had nine sacks, but those sacks accounted for an unusually high percentage of his hurries. The Cardinals signed him, at 33, to a three-year, $24.5 million deal anyway. He regressed (6 sacks total over the next two seasons), as the numbers indicated he would, and Porter was out of football before the third year of the contract.
Last year, Bills linebacker Lorenzo Alexander, at 33, had 12.5 sacks. But similar to Porter, those numbers accounted for a high percentage of his pressures, adding to concerns that a clear outlier season in his career would remain one. He hit free agency, and returned to Buffalo on a two-year deal worth less than $6 million.
That’s not to say the aging Alexander would’ve broken the bank in a similar spot 10 years ago. But it did give teams a clear red flag without having to look at 16 games of tape while assessing hundreds of players in building a free-agent board.
And on the coaching side, it backstops a quality-control coach’s work. Most QCs only get through four or five of an opponent’s games in a week’s worth of prepping a report. Having the extra data on tendencies, situational play, and use of formations and personnel doesn’t mean that assistant can watch more games. It does mean he can crosscheck his findings from the ones he did see.
“The problem with football, they still base a lot of analysis on the previous four games,” Hornsby says. “So let’s say you’re playing Cincinnati, and you want to look at their tendencies when they’re in base personnel. You might wind up with 40 snaps out of 280. And then you’ll make a judgment. Well, of those 40, how many were on third down? How many came on second down?
“So what do you do if there are say, two or three snaps on second down out of base personnel? You have no option but to guess. What you can do with analytics, you’re growing the base data in areas that really matter.”
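For illustration, here is a rough sketch of the crosscheck Hornsby describes: compare an opponent’s run/pass tendency out of base personnel on a given down over the last four games against the full season. The CSV file and its column names are hypothetical stand-ins for whatever charting feed a staff subscribes to.

```python
import pandas as pd

# Hypothetical charting data: one row per snap, with columns such as
# week, personnel ("base", "nickel", ...), down, and play_type ("run"/"pass").
plays = pd.read_csv("opponent_charted_plays.csv")

def run_rate(df):
    """Share of snaps that were runs, plus the sample size behind it."""
    return pd.Series({"snaps": len(df), "run_rate": (df["play_type"] == "run").mean()})

# Base personnel on second down, the situation from the example above.
base_2nd = plays[(plays["personnel"] == "base") & (plays["down"] == 2)]

# Four-game sample vs. the whole season's worth of snaps.
last4 = base_2nd[base_2nd["week"] >= plays["week"].max() - 3]

print("Last 4 games:\n", run_rate(last4))
print("Full season:\n", run_rate(base_2nd))
# A handful of snaps in the four-game sample forces a guess; the season-long
# split is the crosscheck the quality-control coach didn't have before.
```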
In many ways, it’s analogous to what technology like XOS has done for film study—where you can call up plays by down and distance, area of the field, personnel grouping, formation, even hashmark. Coaches can now crosscheck their study of four or five games through the data from full seasons, the same way they check out plays by situation without going through a zillion old beta tapes.

Coca-Cola to shut down Pune analytics centre

NEW DELHI: Coca-Cola is shutting down its biggest analytics, technical and innovation centre in the country, based in Pune, as part of a restructuring undertaken by the beverages maker to become a leaner organisation.

The Eurasia analytical services facility, set up with an investment of Rs 18 crore in 2010 to support operations in India, South Asia, Eastern Europe, Southern Eurasia and the Middle East, was among the US-based firm’s six such centres globally.

“Over the past few months, you have been witnessing our organisation’s strategy getting repurposed and restructuring businesses across the world. In light of the same, the company has decided to consolidate lab facilities across the world and as a result, we plan to cease operations at the Pirangut, Pune lab location by December 2017,” an internal email circulated to senior employees said. ET has seen a copy of the email.

A Coca-Cola India spokesperson confirmed the move.

“This is subsequent to our announcement in February 2017 to design a new operating model that supports our growth strategy as we transform our business into a true total beverage company,” the spokesperson said.

“For the work the lab in Pune is doing today, the company will fulfil the majority of its analytical needs by using labs at its bottling operations and by leveraging third-party accredited analytical labs in India. Additional support for the Indian operations will also be provided from other laboratories of the company.”

The company will provide compensation for all impacted employees as per its policy and in adherence to local laws, the spokesperson said.

The email also said all impacted employees at the centre will be “appropriately compensated”. The decision to close the centre was made as part of a comprehensive review to reshape and restructure businesses across the world, it added.

The centre was created as a ‘world class facility’ to offer analytical and technical support, and provide capabilities to enhance quality and food safety standards, drive innovation and enable Coca-Cola’s long-term strategic growth objectives as defined in the 2020 Vision, the company had said at the time of its debut.

Analytics Release for MangoMap

Maptiks and MangoMap have announced the integration of user engagement analytics with MangoMap’s web map applications. Without any coding required, it is now possible to view in-depth analytics with key metrics on usability, performance and a visitor’s interaction with a MangoMap application. You can learn more about Maptiks at www.maptiks.com and MangoMap at www.mangomap.com.

Will Cadell, CEO of Maptiks spoke about the integration: “MangoMap has grown its user base significantly over the past year, and we love their platform. We believe in making it easier for everyone to build beautiful maps that tell interesting stories. And MangoMap is a great tool to do just that, without having to actually write any code. Moreover, we’ve had many customers ask about getting an integration done with MangoMap, so as of today all MangoMap users can easily start tracking their visitors interactions.”

Analytics have become a critical tool in a geospatial developer’s toolbox over the last few years. Previously, MangoMap only provided an integration with Google Analytics, but many of its users were requesting data about visitor interactions, something Google Analytics couldn’t provide. They turned to Maptiks to help.

Maptiks is purpose-built for web map applications. This allows it to track visitors and events on the map itself, allowing for more in-depth analysis. It provides user activity analytics for most modern web mapping platforms, helping answer key design and UX questions like “How and where are users interacting with a web map?”

About Maptiks

Maptiks is a startup based out of Prince George, BC. Founded by Will Cadell, Maptiks is built by Sparkgeo, a geospatial web company with over a decade of experience in the GIS and technology industries. Working with clients like Nextdoor, the Wildlife Conservation Society and Map My Fitness, Sparkgeo built Maptiks to measure how users interact with web maps and increase map conversions, so customers can #buildabettermap.

NBA analytics: Going data pro

For the NBA, like every other sports league, awards are important. They can generate attention, spur debate, make money, and involve fans, players, and experts, among others. Is there data science and analytics behind them — can there or should there be? We picked the NBA Most Improved Player award as an example to analyze some aspects of data-driven culture.

The NBA is announcing its yearly awards today. This is a much-anticipated event that has been talked about and analyzed extensively on sports media and beyond. Predictions and arguments on who should be nominated and who should win each award have been going on almost since the beginning of the season.

Keeping fans engaged is good, but there are more aspects to awards like these: They can give the media something to talk about, boost player and team statuses, and anyone can bet on the results.

Being part of pop culture, and having the potential to make or break careers and fortunes, means there’s more to the NBA Awards than meets the eye. Let’s try to peek behind the curtain and use data science and analytics to answer a question on many NBA fans’ minds: Who was the most improved player (MIP) in the NBA this season?

Define ‘improved’
To begin with, who gets to define improved, and how? As one NBA writer once put it: “There are few things more frustrating than trying to determine what it means to be the MIP”. On the other hand, that makes it interesting and open to interpretation. Since the NBA does not say much about its criteria and evaluation method, others have tried to come up with their own.

The traditional way NBA writers do this is by assembling a panel of experts and getting them to weigh in. Averaging expert opinions may be more on the objective side than just getting one opinion, but it still does not count as data-driven research in most data analysts’ books.

Adam Fromal from Bleacher Report argued that the MIP is “some years handed to a player who maintained his level of play (or even regressed slightly) while filling a much bigger role. Other times, the league rewards a contributor who made noticeable strides on both ends of the floor and did legitimately improve on a per-minute basis. Stars can win for reaching a new level, though the award often goes to a low- or mid-level rotation member who made the jump to legitimacy. Here, we’re accounting for everything by remaining entirely objective.”

That’s a strong claim there. Here’s what Fromal did and what we can learn from this.

Fromal’s methodology was based on weeding out players who improved for no reason other than newfound opportunity, and grading players by how much they improved in two different overarching statistics. Fromal wanted to reward both players who get better on a per-minute basis and those who stagnate while filling bigger roles.
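In spirit, that kind of grading can be sketched as follows: require meaningful minutes in both seasons so newfound opportunity alone doesn’t register as improvement, then score the year-over-year change in per-minute production. The columns, thresholds and weights below are illustrative assumptions, not Fromal’s actual formula.

```python
import pandas as pd

# Hypothetical table with one row per player and season:
# player, season, minutes_per_game, points_per_36, ws_per_48
# (or whichever per-minute production metrics you prefer).
stats = pd.read_csv("player_seasons.csv")

prev = stats[stats["season"] == 2016].set_index("player")
curr = stats[stats["season"] == 2017].set_index("player")
both = curr.join(prev, lsuffix="_curr", rsuffix="_prev", how="inner")

# Weed out "improvement" that is nothing but newfound opportunity:
# require meaningful minutes in both years, not just a leap from the bench.
qualified = both[(both["minutes_per_game_prev"] >= 15) &
                 (both["minutes_per_game_curr"] >= 20)]

# Grade improvement on a per-minute basis (placeholder weights).
score = ((qualified["points_per_36_curr"] - qualified["points_per_36_prev"]) +
         10 * (qualified["ws_per_48_curr"] - qualified["ws_per_48_prev"]))

print(score.sort_values(ascending=False).head(10))
```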

Fromal presented his analysis in what he called “a countdown that intentionally eschews subjectivity.” That has not always been well received by everyone. Fromal has received anything from profanity to accusations of bias, and he has also been hilariously plagiarized, as the copycat misinterpreted his results. But how well did Fromal do?
Fromal’s top three includes two of the three players nominated by the NBA for the MIP—Giannis Antetokounmpo at No. 2 and Rudy Gobert at No. 3. His No. 1 was Myles Turner, a player overlooked by pretty much everyone else. Fromal missed Denver’s Nikola Jokic, who for most analysts and fans was an obvious contender.

This may give Fromal some objectivity credit, as he is a Denver resident, but it raises the question of where the NBA and data-driven analysis parted ways. The answer perhaps lies in what Fromal himself notes: Sophomores (like Turner) are typically expected to improve. In other analyses, sophomores are excluded from the MIP discussion.

Still, how can Jokic not be in that list? Is it Fromal that’s missing something obvious, or the NBA that has its own way of thinking? Perhaps, more importantly, should it? Does the NBA see something Fromal’s analysis does not, or do people there make their choices not entirely based on data-driven criteria and methods?

Data, meet eyes
Fromal is a professional NBA writer, and although he does not have a formal background in data analysis, he seems to be doing a lot of it for his work. Jay Spanbauer, on the other hand, is not a pro by any means—just a Bucks fan who began looking at the game in a different way with the influx of math and data in the NBA. But Spanbauer’s data-driven analysis succeeded where Fromal’s failed.

Both analyses were done before the NBA announced MIP nominees, but Spanbauer’s preceded Fromal’s by a month and narrowed the MIP battle down to Jokic and Antetokounmpo. Not only that, but he also pointed to a difference between them that may lead the NBA to give Antetokounmpo the MIP award in a race that seems mostly between those two: defense.

Spanbauer used a metric called Defensive Win Shares to show the biggest difference between the two. He pointed out that despite defensive ability being difficult to calculate accurately, it can be seen that Jokic sits below the league average, while Antetokounmpo is over two-and-a-half times higher. Maybe it’s obvious now, but nobody else seems to have used data to nail this when Spanbauer did.
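Here is a small sketch of the comparison Spanbauer drew, assuming a hypothetical table of season totals with a defensive win shares column; the point is to show candidates relative to the league average, not to reproduce his exact numbers.

```python
import pandas as pd

# Hypothetical season totals; only the columns used here are assumed.
df = pd.read_csv("season_totals_2017.csv")   # columns: player, defensive_win_shares

league_avg = df["defensive_win_shares"].mean()

for player in ["Giannis Antetokounmpo", "Nikola Jokic"]:
    dws = df.loc[df["player"] == player, "defensive_win_shares"].iloc[0]
    print(f"{player:22s} DWS={dws:4.1f}  ({dws / league_avg:.1f}x league average)")

# Per the comparison described above, Jokic's figure lands below the league
# average while Antetokounmpo's is more than two and a half times it.
```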

It may seem obvious now, but not many people thought about comparing MIP candidates based on their data and visualizing this for others to see. (Image: Jay Spanbauer)

That is a clearly defined metric and difference, but then why focus on these two players in the first place? Unlike Fromal, Spanbauer went with a combination of instinct and data:

“I think ultimately data should be used to ‘check’ what our eyes see. Anyone who watched the NBA this year saw the amazing leap Giannis Antetokounmpo made. A closer look at his numbers confirm this.

With nonstop coverage, blogs, Twitter, etc, there is enough information and enough discourse for a group of nominees to be fairly decided. I still have confidence in the traditional way nominees are selected, especially for an award as ‘open-ended’ or ‘fluid’ as the MIP. Case-in-point: The clear nominees for MIP in 2017 are Antetokounmpo and Jokic. Looking at numbers and crunching data would likely bring you to the same result.”
Except it did not — at least not using Fromal’s metrics and data. Which brings us to a key point: Even when something is based on data and has clear definitions, that does not make it a God-given truth. Data makes supporting a point of view more credible, and it can also allow discovering patterns that may be otherwise hard to spot. But data-driven does not necessarily equal indisputable.

Antetokounmpo has a back story worthy of Hollywood, is infinitely ambitious and yet keeps his feet on the ground (when not flying above the rim), is a fan and media darling, has been improving dramatically and has entered super-star status. You could say that was perhaps the most obvious choice possible, but going by numbers alone it would have been Myles Turner for MIP.
The problem with data-driven decision making
We already mentioned the “no sophomores for MIP” rule used by some NBA analysts. If the NBA had gone for that, Turner would justifiably not be a MIP nominee, but neither would Jokic. So, if Turner’s numbers are better than Jokic’s, what is the NBA’s reasoning there?

That, or going with Myles Turner for MIP, is the kind of thing that can heat up debates. It can also serve to point out a couple of facts about data-driven decision making.

Coming up with the “right” criteria is hard and ad-hoc. So maybe the criteria for MIP should come down to what Fromal used. And maybe sophomores should be excluded, except in some cases. But then what would those cases be? What about players who make a comeback after a bad year? Or nodding to a player that could use the encouragement, or a market that the league wants to grow?

Whether any of the above are legitimate criteria — or if and how they are considered by the NBA — is open to interpretation. Sometimes such overall organizational goals and drivers are clear, sometimes they are not. But let us not forget: Organization executives have a huge influence on these, regardless of whether data is used to capture and evaluate them.

Going from criteria to metrics is hard and ad-hoc. Let’s suppose that someone has somehow narrowed down the MIP criteria and written them in stone. What is the metric that best expresses each one? And how should they be combined with each other to derive an overall score?

Even the most widely used metrics were derived by someone at some point and carry their creator’s biases and shortcomings—perceived or otherwise. In the case of basketball, probably the most widely known metric is the PER (Player Efficiency Rating). Whether that is the best overall metric to capture a player’s ability and influence on the game is still being debated.

There are more metrics, too, which are constantly evolving, and most of them require some degree of expertise in both the domain (basketball) and the techniques (data science) to be able to fully comprehend and evaluate.

DataOps is the culture and practice of using data and analytics to drive decision making in organizations. But it is not infallible. (Image: Qubole)

Having the right data for the job is hard and not a given. Some data used today to derive information about NBA players’ defensive ability, such as steals and blocks, were not recorded until the 70s. This reflects not just the rising importance of data everywhere, but also the evolution of the domain itself.

When the importance of defense in the game of basketball got more recognition, that data found its place. Gradually, more and more data is being added to the NBA arsenal, including visual and spatio-temporal data, hustle statistics, and social media content.

The process there is two-way. Sometimes someone will come up with an idea to quantify something for which there is no data, and sometimes data that happens to be available ends up being used in unforeseen ways.

Working with five-year-olds is hard, period. Perhaps unsurprisingly, not everyone who cares about the NBA gets, or cares about, data and analytics. MIP nominees have not expressed any sentiment toward such analyses, and not many fans seem to be out there doing what Spanbauer did.

Some might say fans and players are more like five-year-olds anyway, but the truth is that if things are not simple enough for a five-year-old to get, NBA analytics will remain where all other analytics are right now: something a few experts and some enthusiasts can use, something others have heard of and can maybe follow, and something that for most remains mumbo-jumbo.

Like all analytics applications, to apply NBA analytics, the right data sources need to be found, data has to be processed and integrated, domain knowledge applied, analysis done, and results visualized and explained.

So, should the NBA be more transparent about the criteria for its awards? And what would be the result of doing this? Could it make everything deterministic, taking the fun — and money — out of it?

Going data pro
Spanbauer is not the first non-pro to engage with NBA data analysis. There is an array of NBA analytics enthusiasts, and a number of people who work professionally in the area. And the borders between the two are not always clear, as the Seth Partnow story shows. Partnow is an ex-blogger turned analyst who now works with the Bucks. John Hollinger, the person who introduced the PER, now works for the Grizzlies.

But what are people using NBA data and analytics for? It depends on who they are, what they are after, and what tools they have available. What you can do with bedroom analyses will only take you so far. For some things, high school math + spreadsheet/internet + casual fan knowledge + a few hours will do. For others, it’s probably more like a PhD + IBM Watson + basketball guru status + a few months.

We all know the film Moneyball, and many basketball fans are familiar with Kawhi Leonard’s progress through analytics. We also know how top teams in all sports are gradually becoming data driven, and we’ve seen IBM’s Watson touted as a tool to help NBA teams. Some of us have even heard that the hot hand fallacy may itself be a fallacy.

For teams, the first priority is to analyze their own players’ game and that of their opponents, in order to improve the former and counter the latter. Scouting new recruits is also important, and at the end of the day, it all comes down to winning more games, which also translates into more profit.

NBA teams seem to be using analytics applications across the spectrum: From understanding what has happened and why, to predicting what will happen, and to making it happen — descriptive, diagnostic, predictive, and prescriptive analytics.

Data in all shapes and sizes is used everywhere, and the NBA is no exception.

For betting enthusiasts, it’s not so much about the game itself, but mostly about making the right prediction that will turn them into winners. For fans like Spanbauer, it’s mostly about getting more insight into the game. As a representative of a data-driven culture trickling down, his views are interesting:

“It’s difficult to ignore the role and effect advanced metrics and statistics have had in basketball – as well as other sports. While analytics isn’t the only way to analyze the game, I like to think of it as another lens through which to look.

I wouldn’t say that analytics is all about predictions. Or even results, to be honest. It’s about altering the traditional mindset of organizations, and the evolution of the front office. You see more money being spent on research, and more jobs opening up in the field of analytics.

At the end of the day, numbers are just numbers. A lot of attention is paid to them – sometimes more than necessary. The human element of the game cannot and should not be ignored. There are still intangibles that we have not been able to measure, and perhaps will not be able to measure.

That said, we should still continue to seek answers by utilizing as much data as we can. The more numbers and information fed into any model will provide more accurate results.

I don’t necessarily believe awards should have some sort of criteria. Awards are for the fans, and part of the fun of the awards is debating amongst other fans your opinion of who should or should not win. However, with award selections dictating salaries and benefits like the designated-player exception, some consideration into criteria should certainly be given.

There is enough information out there to justify spending money and using analytics to measure many different areas. Owners and general managers are on their own to decide whether or not they want to trust themselves, or the numbers.”

CIIE-IIMA Incubatee FORMCEPT Raises Funding From GVFL

Data analytics startup FORMCEPT Technologies and Solutions Pvt. Ltd has raised an undisclosed amount of Series A funding from VC fund GVFL (Gujarat Venture Finance Ltd).

The company is an incubatee of the Centre for Innovation Incubation and Entrepreneurship (CIIE) at IIM Ahmedabad. GVFL has invested in FORMCEPT through its recent GVFL Startup Fund. As part of the deal, Sankalp Bajpai, Vice President, Investments, GVFL Limited, will join the company’s board.

Founded in 2011 by Suresh Srinivasan and Anuj Kumar, Bengaluru-based FORMCEPT offers a unified data analysis platform that helps enterprises get actionable insights from their data faster. MECBOT, the company’s flagship product, is an Open Cognitive Platform.

It transforms the way data analytics, deep learning, artificial intelligence (AI), predictive analytics, and IoT interface with each other on a near real-time basis. This leads to superior efficiencies, smart monitoring, and intelligent resource management. The startup claims to derive actionable insights from data in half the time and at one-third the cost.

Talking about the development, Suresh Srinivasan, co-founder of FORMCEPT said, “AI empowered by big data is disrupting the entire data analysis space. MECBOT, with its core capabilities and being built using Deep learning and AI techniques to handle all forms of data, is at the right inflection point on the global growth trajectory.”

The startup will use the newly raised funds to accelerate global growth. The company claims that it has a handful of Fortune 1,000 clients. It is planning to expand its technology team in India and cater to international markets.

Sanjay Randhar, Managing Director, GVFL Limited said, “As the explosive growth of data generation continues across industries, enterprises will immensely benefit from having a unified data analytics platform that can perform real-time analytics on both structured and unstructured data. We believe FORMCEPT’s product will deliver this value proposition to enterprises and help drive faster data analytics to gain better ROI.”

FORMCEPT recently worked with ESPN Cricinfo to develop an end-to-end, comprehensive, query-enabled platform that provides access to searchable cricket big data and statistics in a user-friendly manner.

In April 2017, customer data analytics software company Flytxt raised about $11 Mn funding from DAH Beteiligungs GmbH, a company related to the Hopp family office. Earlier this week, Quantta Analytics raised an undisclosed amount of funding in its Pre-Series A round. The round was led by undisclosed entrepreneurs and investors from India and Silicon Valley. Other startups in the data analytics segment along with FORMCEPT include IntelligenceNODE, Indus Insights, Axtria, BRIDGEi2i, DataCultr, etc.

