Monthly Archive: June 2017

How to Understand User Behavior With Google Analytics


Want more conversions? Want to make smarter and better decisions about your SEO and other marketing activities?

Then you need to analyze what people do on your website.

Google Analytics is the single best tool to help you understand user behavior.

Here are three ways you can use Google Analytics to understand exactly what people do when they land on your website.

Use In-Page Analytics to Analyze Website User Behavior

In-Page Analytics is a neat Google Chrome extension that is useful for a couple of reasons:

Quick Access

In-Page Analytics lets you instantly connect to your Google Analytics account without having to log in to the traditional interface (assuming you’re already logged into Google Chrome). This makes the process much faster when you want to do a quick check of how a particular page on your site is performing.

Like the traditional interface, you can select a date range for the data you’d like to see and instantly access page views, average time on page, and bounce rate all while you’re surfing your website.

If you really want to get fancy, you can also view your data by specific segments: all users, new users, organic traffic, paid traffic, and mobile traffic, just to name a few. You can even see how many people are on your site at that exact moment.
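The metrics named above come from simple arithmetic over sessions. As a rough sketch, bounce rate is the share of single-interaction sessions (simplified here to single-page sessions; the session data below is entirely made up, not real Analytics output):

```python
# Illustrative session log (made-up data).
sessions = [
    {"pages_viewed": 1, "seconds_on_page": 0},
    {"pages_viewed": 3, "seconds_on_page": 95},
    {"pages_viewed": 2, "seconds_on_page": 40},
    {"pages_viewed": 1, "seconds_on_page": 0},
]

# Bounce rate: share of sessions that viewed only one page.
bounce_rate = sum(s["pages_viewed"] == 1 for s in sessions) / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")  # bounce rate: 50%
```

Google Analytics computes this for you, of course; the point is only that the number is a ratio you can reason about, not a black box.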

Page Analytics Google Chrome extension

Heat Map

What I love most about the extension is the heat map functionality. You can see exactly what percentage of website visitors click on any clickable area of your site. This is so useful because you can see if people are actually clicking on all of the buttons, links, etc. on any given page.

Let’s say you have a button with a call to action that says “read more”. You’ll be able to see the percentage of people who actually click on that button.

You may find out that your “obvious” call to action may not actually be so obvious. Based on the results from the heat map, you can make tweaks and improve the overall user experience of your site.
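The heat map percentages themselves are just clicks on an element divided by total pageviews. A minimal sketch, with hypothetical element names and click counts:

```python
def click_percentages(clicks_by_element, total_pageviews):
    """Return the share of pageviews (as a percentage) that clicked each element."""
    return {
        element: round(100 * clicks / total_pageviews, 1)
        for element, clicks in clicks_by_element.items()
    }

# Made-up counts for a page with 2,000 pageviews
clicks = {"read more": 140, "services": 260, "about us": 35}
print(click_percentages(clicks, total_pageviews=2000))
# {'read more': 7.0, 'services': 13.0, 'about us': 1.8}
```

A "read more" button clicked by only 7% of visitors is exactly the kind of not-so-obvious call to action the heat map surfaces.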

Check out this screenshot of the footer of my site:


My most-clicked links are to my services, portfolio, and blog. So much for the theory that the About page is the most popular page on your site.

This is why understanding your analytics and the data specific to your site is critical to understanding how your audience and your website users behave. Knowing general trends can be useful, but nothing is better than knowing and understanding your own stats.

Set Up Conversion Goals to Track Website User Behavior

A goal allows you to measure how often a user takes a specific action on your site.

Google Analytics goals let you track the conversion rate of a particular action you want your website users to take, such as signing up for a newsletter, claiming a free gift, making a purchase, or watching a video.

Why is this important?

It allows you to translate your business objectives into measurable actions within your website. Instead of just knowing how many people land on a page, you’ll know what specific action they’re taking (or not taking) on that page.

This information will give you valuable insight into what calls to action you may need to tweak or possibly remove from any given page.

To set up a goal, go to Admin > Goals > New Goal > Select template or Custom (I prefer Custom).

Next, give your goal a name and select the type of goal:

Destination: The goal is complete when a user lands on a particular page.
Duration: The goal is based on how long it takes a user to complete an action.
Pages/Screens per session: The goal is complete when a user browses a certain number of pages.
Event: The goal is complete when a user triggers an event (such as watching a video).
Lastly, you’ll enter the specific details based on the type of goal you selected and how much detail you would like to see.

For example, you can assign a monetary value to a goal to view the approximate amount of revenue you’ll generate for each completed goal.

Setting up goals in Google Analytics

My favorite type of goal for tracking website user behavior is a destination goal. It’s simple to configure and it can track visitor actions such as signing up for a free lead magnet.

So, as an example, let’s say you have a sign-up form for a free gift located on several pages within your site. A Google Analytics goal will measure the percentage of users who actually sign up for your free gift on each page that has the form. This lets you see not only how many people are signing up in total but, more importantly, how many are signing up on each page.

Google Analytics puts your data into context so you can easily see which pages convert best. Knowing this, you can send traffic directly to the high-converting pages and split test to improve the performance of the lower-converting ones.
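The underlying comparison is a per-page conversion rate: goal completions divided by sessions, ranked best first. A sketch with hypothetical page paths and numbers:

```python
def conversion_rates(page_stats):
    """page_stats maps page -> (sessions, goal_completions).
    Returns (page, rate) pairs sorted best-converting first."""
    return sorted(
        ((page, completions / sessions)
         for page, (sessions, completions) in page_stats.items()),
        key=lambda item: item[1],
        reverse=True,
    )

stats = {
    "/free-gift": (500, 60),
    "/blog/post-a": (2000, 80),
    "/about": (800, 8),
}
for page, rate in conversion_rates(stats):
    print(f"{page}: {rate:.1%}")
# /free-gift: 12.0%
# /blog/post-a: 4.0%
# /about: 1.0%
```

Note that the busiest page (/blog/post-a) is not the best converter, which is exactly the distinction a destination goal makes visible.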

People often think their largest traffic sources correspond to their largest number of conversions, but that’s not necessarily the case. If you only look at your traffic reports and never set up goals, you can miss this piece of the puzzle.

To put this into perspective, let’s say your traffic reports tell you that the majority of your traffic is coming from Pinterest. But when you view your goals, you may see that the majority of your conversions are actually coming from search engine traffic. So even though you may have less search engine traffic than Pinterest traffic, based on your goals the search engine visitors are more inclined to opt-in to your list. Tracking your goals will allow you to easily determine this and you can then adjust your marketing strategies accordingly.
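That comparison is easy to make concrete. The per-source figures below are invented, in the spirit of the Pinterest-vs-search example above:

```python
# Hypothetical sessions and sign-ups by traffic source.
sources = {
    "pinterest": {"sessions": 10_000, "signups": 100},      # 1.0% conversion
    "organic search": {"sessions": 4_000, "signups": 200},  # 5.0% conversion
}

# The source that converts best is not the one with the most traffic.
best = max(sources, key=lambda s: sources[s]["signups"] / sources[s]["sessions"])
print(best)  # organic search
```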

View Behavior Reports to Understand Website User Behavior

Behavior reports are the best tools to use for diving deep into website user behavior.

While it’s great to know the type of people who visit your site (Audience Reports) and how people find their way to your site (Acquisition Reports), the best reports for understanding what actions people take once they’re on your website are the Behavior Reports.

Google Analytics Behavior Reports

Your Behavior Reports can answer these questions and much more:

Are people actually following the user path you intended?
What page or blog post are people using the most to sign up for your newsletter?
Which pages do people stay on for the longest and shortest amounts of time?
What are people searching for in the search bar on your website?
What events (such as watching a video) are people completing (or not completing) on your site?
Which pages are converting the best?
Which calls to action are not converting and should possibly be removed from the site?

Knowing how people behave when they’re on your site will help you understand your conversions and how you should structure your marketing campaigns to increase your ROI and overall success.

Analytics and the NFL: Finding Strength in Numbers


Paraag Marathe arrived in San Francisco with a very clear directive from the 49ers: reimagine the Jimmy Johnson draft chart. After a couple of months’ work, the senior associate from the Boston consulting firm Bain readied to show team president Bill Walsh and GM Terry Donahue his findings, still in disbelief at the results being spit back at him.
It was uncanny.
“I tried to use historical trends and true value,” says Marathe, now the Niners’ chief strategy officer and EVP of football operations​. “And it wasn’t like Coach Walsh was telling me, ‘Hey, a third rounder has more value than it says here, and a second rounder has lower value.’ It was meant to be totally independent. And once I was finished, I looked at all of Coach Walsh’s trades over the years. And it was a total match with all of Coach Walsh’s trades.”
All that work, and all that was proven was what some guy on his couch down the street in Santa Clara could’ve told them: Walsh was a savant.
That was 2001. Moneyball—the best-selling book based on the baseball team across the Bay from the Niners, which would shed light on the coming numbers craze—was still two years from release. The idea of using advanced statistics to drive decision-making in baseball was still in a nascent stage, at least publicly. And that, of course, implies the truth about the NFL then, which is that few in football had even given that concept a thought.
It’s a different time now. Analytics in the NFL have moved well beyond the point where a team hiring a consulting firm to run numbers constitutes outside-the-box thinking. Yet, there remains resistance, a battleground of thought, and a general cloudiness on how far you can take numbers, and how far numbers can take you.
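A draft value chart reduces a trade to arithmetic: sum the point values of the picks on each side and compare. The chart excerpt below is illustrative only (pick 1 = 3,000 points is the well-known anchor of the Jimmy Johnson chart; treat the other values as placeholders):

```python
# Illustrative excerpt of a draft value chart (placeholder values).
chart = {1: 3000, 10: 1300, 16: 1000, 33: 580, 50: 400}

def trade_value(picks):
    """Total chart value of a package of picks."""
    return sum(chart[p] for p in picks)

# A team weighing pick 10 against a package of picks 16 and 33:
print(trade_value([10]), "vs", trade_value([16, 33]))  # 1300 vs 1580
```

Marathe's insight was to rebuild the point values from historical outcomes rather than take them on faith, and the rebuilt chart matched Walsh's trading record anyway.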
And just the same as 16 years ago, you can place a genius in the center of it.
Perception holds that the Patriots are among the league’s most progressive teams, but there’s precious little evidence of their investment. They have people who are responsible for advanced statistics, but coaches and scouts are largely charged with integrating data gathered into their work. The “analytics guy” there is 64-year-old Ernie Adams, a former Wall Street trader and prep school buddy of Bill Belichick’s.
So why are people so convinced that the five-time champions are knee-deep in the numbers? As one informed long-time NFL exec explains it, “It’s because they’re completely consistent with what sophisticated analytics would tell you to do.”
“[Belichick] does it with intuition,” says one AFC executive. “You know because you’ve been coaching for so long, how you match these 11 guys against those 11 guys. It all makes sense to you. At some point, maybe we can all come to those conclusions without having Bill Belichick’s brain. We’re still a long way from that.”

The NFL is getting closer. The MMQB spent a month discussing analytics with more than 40 team officials from across the league—coaches, executives, scouts and analytics people—and there are some hard conclusions that can be made on where the league is.
1. Most teams don’t shy away from analytics. In fact, more than one official was offended by the notion that their team would be called “old school”. When it comes to player acquisition (which is what Moneyball was based on), the average NFL team is using the data. It’s just that it is being used to generate boundaries rather than drive decisions. Teams want to know when they’re making exceptions on one player, and they want to know what they might be missing on another they may have otherwise dismissed.
2. On the coaching side, analytics are generally used to make staffs more efficient. There may only be time for quality-control coaches to break down four or five of an opponent’s games in the week they have leading into a particular game. And that, in the past, would lead to guesswork on tendencies and strengths and weaknesses. The data allows the quality-control guys, and staffs, to crosscheck against larger sample sizes.
3. The limits in those two areas are the number of games (16) and the variance in players’ assignments and situations that affect plays. That makes it more difficult to collect the amount of data necessary to make hard decisions.
4. Conversely, the value of the data in those areas is proven in that nearly the entire NFL has subscribed to Stats LLC and/or Pro Football Focus, and some rely on smaller services, like Pro Scout Inc., which is run by 64-year-old former Utah coach Mike Giddings.
5. There aren’t sure marks of analytics-friendly operations on game days (as there are in basketball, with teams that go for the “two-for-one” possession at the end of the quarter or half). But on the personnel side, you can see it in asset management, with teams that trade down in the draft to pad their margin for error, and use cap space creatively.
6. There is one strong consensus league-wide: Analytics data related to injury prevention, which straddles sports science and comes through player tracking, is useful and will only get better. The NFL is just scratching the surface with this technology, and the floodgates will truly open only when the league makes available all the Zebra data it’s been collecting. Another step here that’s expected to come eventually is in profiling the minds of players.
7. The Walsh/Belichick robot is not on the market. Yet.
• SMARTER FOOTBALL WEEK: A series examining the cerebral side of the sport, including technology, analytics, how a brainy linebacker prepares and just what goes into a typical NFL play.
* * *
The cliché is still going strong. On one side you have the gym teacher with the whistle around his neck, on the other it’s the dweeb with the taped glasses, pocket protector, and stacks of hard-to-decipher numbers and data. And, if you believe the cliché, it continues that football’s tough-guy culture has made it slower to change than other sports.
The truth is more complicated. Part of the problem with finding analytics’ place in football is the term itself. Much of the perceived hesitancy to embrace quantitative analysis in the NFL stems from the fact that what is now referred to as “analytics” was already a huge part of the game. It just didn’t have that name.
The game itself is rooted in tactics and strategy and details, and so the study of those has always been inherent in the coaching of the sport and building of its teams.
“I think most of the stuff we’ve done for a long time,” says one NFC general manager. “[Analytics] is the buzzword. That’s how everyone clumps it together now. Like, until five years ago, I’d never heard combine data called ‘analytics.’ But now, someone smart can make it look pretty—analytics. To me, analytics is the Pro Football Focus numbers, finding a way to put a numeric grade on every play and quantify it.
“No one has enough people on staff to do all of that. A lot of it is just statistics, and then what the smart computer guys can make it tell you.”
Photo: Carl Iwasaki/Sports Illustrated
Bill Walsh was ahead of his time on analytics, even if he didn’t know it.

There’s a lot of gray to cut through. Quality control coaches have forever done what are now considered game-week analytics—identifying opponents’ tendencies, finding trends, and setting up their bosses to game plan. On the personnel side, scouting assistants have, likewise, undertaken analytical studies for decades to uncover advantages and inefficiencies in the draft and free agency.
“Up until 15 years ago, football was probably the furthest along of any of the sports as far as studying the game itself,” says an analytics manager for one AFC team. “It just wasn’t guys from fancy schools doing regression. But there was real systematic study of the game. And frankly, basketball and baseball only started doing the kind of film study that football has done forever just recently.”
So here’s where you start: The reason there isn’t an NFL team ignoring analytics is because analytics has been done in football since Paul Brown came along.
Some teams have very little in the way of analytics staffing and, based on their business practices, are considered advanced. Others have entire departments in place and those who study analytics wonder, based on the way those teams operate, if all their number-crunching has influenced even a single decision.
How is that possible? Well, as Jaguars SVP of football technology and analytics Tony Khan, the son of owner Shahid Khan, explains it: “The adoption rate is far behind other sports.” More than three-quarters of NFL teams either employ a director of analytics or have a full-blown analytics department. Others have their cap managers oversee it.
The divide in the NFL comes in how the data collected is put to work.
“There are a lot of skeptics,” Khan says. “And that’s honestly probably on the analysts and the statisticians. You have to be able to explain it to the football people in their terms. That’s why I try to study that, to be able to communicate with coaches and scouts on a meaningful level, instead of forcing them to use our terminology.”
It’s easy to figure out where and why the divide happens. Football lacks the repetitive one-on-one situations that make up a large portion of baseball and basketball, and it’s tougher to figure out what each player’s assignment is without hearing it from a coach. Those assignments are more divergent, too—in basketball, a center’s role on defense can be affected by the opposing point guard’s movements and decisions and the spacing of all the opposing players; in football, a right tackle’s objective doesn’t relate much to what a free safety is deployed to do.
That makes putting values and creating apples-to-apples judgments on players difficult, and the idea of filtering gameday decisions through a strict set of rules problematic. It also makes it challenging for services like PFF and Stats LLC to find the right way to assess players and plays on their own, and generate value for coaches and scouts.
Then, there’s the issue of sample size.
“In baseball, you have 162 games, and the same exact play starts every play, and that’s the beauty of it,” says one AFC general manager. “You start to appreciate the sheer volume of the numbers, and how they compare over time. In basketball, there’s a natural progression—up and down, up and down. The event of the team going up the court happens so many times, you can chart shots, rebounds.
“And so much of it in both sports is pass/fail. You break that down with the amount of times the events repeat themselves, you can build big data.”
In football, on the other hand, you get anecdotes like this one: There is one team that has a club exec who heads up data collection. He’ll call his head coach into his office and hand the data to him. But the report never makes it to the coach’s desk, only out to the hallway where, between the two offices, there is a trashcan.
• WELCOME TO 1967 WEEK: Crazy players, an unforgettable expansion season and an ill-fated third-place game. A look at what the game, the players and the culture of professional football were like a half-century ago.
* * *
Some teams have analytics directors. Others have departments. Some integrate data and make football people responsible for working it into their jobs. Others keep the sides separate, with an over-the-top manager (usually the GM) sorting it out. Some build their own systems. Others have the numbers people there only as window dressing.
One commonality? Most subscribe to services. Cincinnati-based Pro Football Focus now lists 27 teams as clients, and founder Neil Hornsby says, “I’d be surprised if we haven’t got 32 of 32 by next season, if not this season.” Similarly, Stats LLC has 26 teams as subscribers. The reason? It’s a way to become more efficient and guard against missing anything, even if you’re still not sure how to use all the numbers.
“I come from a consumer product and enterprise software background,” says John Pollard, a former exec at Stats LLC who now works as a consultant. “And what struck me in the early 2000s is when more information and research and analytics became available in over-the-counter pharmaceuticals and retail, it was just too much information. And we’re going through the same thing in sports now.
“It’s being driven by the information. Technology provides the efficiency. And analytics provide more effectiveness in the decision-making. The goal isn’t to replace people. It’s to make them more effective and efficient.”
Photo: Bill Frakes/Sports Illustrated
Tony Khan heads up the analytics operation for his father’s team in Jacksonville.

On the personnel side, there are clear examples of where the numbers are leading to change. In 2009, Dolphins linebacker Joey Porter had nine sacks, but those sacks accounted for an unusually high percentage of his hurries. The Cardinals signed him, at 33, to a three-year, $24.5 million deal anyway. He regressed (6 sacks total over the next two seasons), as the numbers indicated he would, and Porter was out of football before the third year of the contract.
Last year, Bills linebacker Lorenzo Alexander, at 33, had 12.5 sacks. But similar to Porter, those numbers accounted for a high percentage of his pressures, adding to concerns that a clear outlier season in his career would remain one. He hit free agency, and returned to Buffalo on a two-year deal worth less than $6 million.
That’s not to say the aging Alexander would’ve broken the bank in a similar spot 10 years ago. But it did give teams a clear red flag without having to look at 16 games of tape while assessing hundreds of players in building a free-agent board.
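The red-flag heuristic in the Porter and Alexander examples can be sketched directly: when sacks make up an unusually large share of a rusher's total pressures, the sack total is likely an outlier. The threshold and the stat lines below are assumptions for illustration, not any service's actual model:

```python
def sack_share(sacks, total_pressures):
    """Fraction of a player's total pressures that became sacks."""
    return sacks / total_pressures

def outlier_flag(sacks, total_pressures, threshold=0.35):
    """Flag a season whose sack total rests on unusually few pressures."""
    return sack_share(sacks, total_pressures) > threshold

# Hypothetical rusher: 12.5 sacks on only 30 total pressures -> flagged
print(outlier_flag(12.5, 30))  # True
# Hypothetical rusher: 9 sacks on 60 total pressures -> not flagged
print(outlier_flag(9, 60))     # False
```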
And on the coaching side, it backstops a quality control guy’s work. Most QCs only get through four or five of an opponent’s games in a week’s work of prepping a report. Having the extra data on tendencies, situational play, and use of formations and personnel doesn’t mean that assistant can watch more games. It does mean he can crosscheck his findings from the ones he did see.
“The problem with football, they still base a lot of analysis on the previous four games,” Hornsby says. “So let’s say you’re playing Cincinnati, and you want to look at their tendencies when they’re in base personnel. You might wind up with 40 snaps out of 280. And then you’ll make a judgment. Well, of those 40, how many were on third down? How many came on second down?
“So what do you do if there are say, two or three snaps on second down out of base personnel? You have no option but to guess. What you can do with analytics, you’re growing the base data in areas that really matter.”
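Hornsby's sample-size point is easy to see in code: filter a few games' worth of snaps to one personnel grouping, then split by down, and the buckets collapse. The snap data below is invented to match his 40-of-280 scenario:

```python
from collections import Counter

# Invented snap log: ~280 snaps, 40 of them in base personnel.
snaps = (
    [{"personnel": "base", "down": 1}] * 25
    + [{"personnel": "base", "down": 2}] * 3
    + [{"personnel": "base", "down": 3}] * 12
    + [{"personnel": "nickel", "down": 3}] * 240
)

# How many base-personnel snaps do we actually have per down?
base_by_down = Counter(s["down"] for s in snaps if s["personnel"] == "base")
print(base_by_down)  # Counter({1: 25, 3: 12, 2: 3})
```

Three second-down snaps is a guess, not a tendency; growing that base data across full seasons is what the analytics services sell.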
In many ways, it’s analogous to what technology like XOS has done for film study—where you can call up plays by down and distance, area of the field, personnel grouping, formation, even hashmark. Coaches can now crosscheck their study of four or five games through the data from full seasons, the same way they check out plays by situation without going through a zillion old beta tapes.

Coca-Cola to shut down Pune analytics centre


NEW DELHI: Coca-Cola is shutting down its biggest analytics, technical and innovation centre in the country, based in Pune, as part of a restructuring undertaken by the beverage maker to become a leaner organisation.

The Eurasia analytical services facility, set up with an investment of Rs 18 crore in 2010 to support operations in India, South Asia, Eastern Europe, Southern Eurasia and the Middle East, was among the US-based firm’s six such centres globally.

“Over the past few months, you have been witnessing our organisation’s strategy getting repurposed and restructuring businesses across the world. In light of the same, the company has decided to consolidate lab facilities across the world and as a result, we plan to cease operations at the Pirangut, Pune lab location by December 2017,” an internal email circulated to senior employees said. ET has seen a copy of the email.

A Coca-Cola India spokesperson confirmed the move.

“This is subsequent to our announcement in February 2017 to design a new operating model that supports our growth strategy as we transform our business into a true total beverage company,” the spokesperson said.

“For the work the lab in Pune is doing today, the company will fulfil the majority of its analytical needs by using labs at its bottling operations and by leveraging third-party accredited analytical labs in India. Additional support for the Indian operations will also be provided from other laboratories of the company.”

The company will provide compensation for all impacted employees as per its policy and in adherence to local laws, the spokesperson said.

The email also said all impacted employees at the centre will be “appropriately compensated”. The decision to close the centre was made as part of a comprehensive review to reshape and restructure businesses across the world, it added.

The centre was created as a ‘world class facility’ to offer analytical and technical support, and provide capabilities to enhance quality and food safety standards, drive innovation and enable Coca-Cola’s long-term strategic growth objectives as defined in the 2020 Vision, the company had said at the time of its debut.

Analytics Release for MangoMap


Maptiks and MangoMap have announced the integration of user engagement analytics with MangoMap’s web map applications. Without any coding required, it is now possible to view in-depth analytics with key metrics on usability, performance, and a visitor’s interaction with a MangoMap application. You can learn more on the Maptiks and MangoMap websites.

Will Cadell, CEO of Maptiks spoke about the integration: “MangoMap has grown its user base significantly over the past year, and we love their platform. We believe in making it easier for everyone to build beautiful maps that tell interesting stories. And MangoMap is a great tool to do just that, without having to actually write any code. Moreover, we’ve had many customers ask about getting an integration done with MangoMap, so as of today all MangoMap users can easily start tracking their visitors interactions.”

Analytics have become a critical tool in a geospatial developer’s toolbox over the last few years. Previously, MangoMap only provided an integration with Google Analytics, but many of its users were requesting data on visitor interactions, something Google Analytics couldn’t provide. They turned to Maptiks for help.

Maptiks is purpose-built for web map applications. This allows it to track visitors and events on the map itself, enabling more in-depth analysis. It provides user activity analytics for most modern web mapping platforms, helping answer key design and UX questions such as “How and where are users interacting with a web map?”

About Maptiks

Maptiks is a startup based out of Prince George, BC. Founded by Will Cadell, Maptiks is built by Sparkgeo, a geospatial web company with over a decade of experience in the GIS and technology industries. Working with clients like Nextdoor, the Wildlife Conservation Society, and Map My Fitness, Sparkgeo built Maptiks to measure how users interact with web maps and increase map conversions, so users can #buildabettermap.

NBA analytics: Going data pro


For the NBA, like every other sports league, awards are important. They can generate attention, spur debate, make money, and involve fans, players, and experts, among others. Is there data science and analytics behind them — can there or should there be? We picked the NBA Most Improved Player award as an example to analyze some aspects of data-driven culture.

The NBA is announcing its yearly awards today. This is a much-anticipated event that has been talked about and analyzed extensively on sports media and beyond. Predictions and arguments on who should be nominated and who should win each award have been going on almost since the beginning of the season.

Keeping fans engaged is good, but there are more aspects to awards like these: They can give the media something to talk about, boost player and team statuses, and anyone can bet on the results.

Being part of pop culture, and having the potential to make or break careers and fortunes, means there’s more to the NBA Awards than meets the eye. Let’s peek behind the curtain and use data science and analytics to answer a question on many NBA fans’ minds: Who was the most improved player (MIP) in the NBA this season?

Define ‘improved’
To begin with, who gets to define improved, and how? As one NBA writer once put it: “There are few things more frustrating than trying to determine what it means to be the MIP”. On the other hand, that makes it interesting and open to interpretation. Since the NBA does not say much about its criteria and evaluation method, others have tried to come up with their own.

The traditional way NBA writers do this is by assembling a panel of experts and getting them to weigh in. Averaging expert opinions may be more on the objective side than just getting one opinion, but it still does not count as data-driven research in most data analysts’ books.

Adam Fromal from Bleacher Report argued that the MIP is “some years handed to a player who maintained his level of play (or even regressed slightly) while filling a much bigger role. Other times, the league rewards a contributor who made noticeable strides on both ends of the floor and did legitimately improve on a per-minute basis. Stars can win for reaching a new level, though the award often goes to a low- or mid-level rotation member who made the jump to legitimacy. Here, we’re accounting for everything by remaining entirely objective.”

That’s a strong claim there. Here’s what Fromal did and what we can learn from this.

Fromal’s methodology was based on weeding out players who improved for no reason other than newfound opportunity, and grading players by how much they improved in two different overarching statistics. Fromal wanted to reward both players who get better on a per-minute basis and those who stagnate while filling bigger roles.
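The per-minute side of that idea can be sketched by normalizing production to a per-36-minutes rate before comparing seasons, so extra playing time alone doesn't register as improvement. The stat lines below are hypothetical, and Fromal's actual formula is not public in this form:

```python
def per_36(points, minutes):
    """Normalize total points to a per-36-minutes rate."""
    return 36 * points / minutes

def per_minute_improvement(season_prev, season_curr):
    """Each season is (total_points, total_minutes); returns per-36 change."""
    return per_36(*season_curr) - per_36(*season_prev)

# A player who scored more overall, but only because of more minutes:
print(round(per_minute_improvement((800, 1600), (1200, 2400)), 1))  # 0.0
# A player who genuinely improved on a per-minute basis:
print(round(per_minute_improvement((800, 1600), (1500, 2400)), 1))  # 4.5
```

The first player fills a bigger role while stagnating; the second gets better per minute. A methodology like Fromal's has to decide how much credit each deserves.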

Fromal presented his analysis in what he called “a countdown that intentionally eschews subjectivity.” That has not always been well received by everyone. Fromal has received anything from profanity to accusations of bias, and he has also been hilariously plagiarized, as the copycat misinterpreted his results. But how well did Fromal do?
Fromal’s top three includes two of the three players nominated by the NBA for the MIP: Giannis Antetokounmpo at No. 2 and Rudy Gobert at No. 3. His No. 1 was Myles Turner, a player overlooked by pretty much everyone else. Fromal missed Denver’s Nikola Jokic, who for most analysts and fans was an obvious contender.

This may give Fromal some objectivity credit, as he is a Denver resident, but it raises the question of where the NBA and data-driven analysis parted ways. The answer perhaps lies in what Fromal himself notes: Sophomores (like Turner) are typically expected to improve. In other analyses, sophomores are excluded from the MIP discussion.

Still, how can Jokic not be in that list? Is it Fromal that’s missing something obvious, or the NBA that has its own way of thinking? Perhaps, more importantly, should it? Does the NBA see something Fromal’s analysis does not, or do people there make their choices not entirely based on data-driven criteria and methods?

Data, meet eyes
Fromal is a professional NBA writer, and although he does not have a formal background in data analysis, he seems to be doing a lot of it for his work. Jay Spanbauer, on the other hand, is not a pro by any means, just a Bucks fan who began looking at the game in a different way with the influx of math and data in the NBA. But Spanbauer’s data-driven analysis succeeded where Fromal’s failed.

Both analyses were done before the NBA announced the MIP nominees, but Spanbauer’s preceded Fromal’s by a month and narrowed the MIP battle down to Jokic and Antetokounmpo. Not only that, but he also pointed to the difference between them that may lead the NBA to give Antetokounmpo the award in a race that seems mostly between those two: defense.

Spanbauer used a metric called Defensive Win Shares to show the biggest difference between the two. He pointed out that despite defensive ability being difficult to calculate accurately, it can be seen that Jokic sits below the league average, while Antetokounmpo is over two-and-a-half times higher. Maybe it’s obvious now, but nobody else seems to have used data to nail this when Spanbauer did.
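Spanbauer's comparison amounts to putting each candidate's Defensive Win Shares next to the league average. The numbers below are placeholders shaped like the article's claim (one player below average, the other more than two and a half times above it), not the actual 2016-17 figures:

```python
# Placeholder values, not real Defensive Win Shares data.
league_avg_dws = 1.6
candidates = {"Jokic": 1.2, "Antetokounmpo": 4.2}

for name, dws in candidates.items():
    ratio = dws / league_avg_dws
    print(f"{name}: {dws} DWS ({ratio:.2f}x league average)")
```

Even with a metric as debated as DWS, framing the comparison against a league baseline is what made the difference visible at a glance.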

It may seem obvious now, but not many people thought about comparing MIP candidates based on their data and visualizing this for others to see. (Image: Jay Spanbauer)

That is a clearly defined metric and difference, but then why focus on these two players in the first place? Unlike Fromal, Spanbauer went with a combination of instinct and data:

“I think ultimately data should be used to ‘check’ what our eyes see. Anyone who watched the NBA this year saw the amazing leap Giannis Antetokounmpo made. A closer look at his numbers confirm this.

With nonstop coverage, blogs, Twitter, etc, there is enough information and enough discourse for a group of nominees to be fairly decided. I still have confidence in the traditional way nominees are selected, especially for an award as ‘open-ended’ or ‘fluid’ as the MIP. Case-in-point: The clear nominees for MIP in 2017 are Antetokounmpo and Jokic. Looking at numbers and crunching data would likely bring you to the same result.”
Except it did not — at least not using Fromal’s metrics and data. Which brings us to a key point: Even when something is based on data and has clear definitions, that does not make it a God-given truth. Data makes supporting a point of view more credible, and it can also allow discovering patterns that may be otherwise hard to spot. But data-driven does not necessarily equal indisputable.

Antetokounmpo has a back story worthy of Hollywood, is infinitely ambitious and yet keeps his feet on the ground (when not flying above the rim), is a fan and media darling, has been improving dramatically and has entered super-star status. You could say that was perhaps the most obvious choice possible, but going by numbers alone it would have been Myles Turner for MIP.
The problem with data-driven decision making
We already mentioned the “no sophomores for MIP” rule used by some NBA analysts. Had the NBA gone with that, Turner would justifiably not be an MIP nominee, but neither would Jokic. So, if Turner’s numbers are better than Jokic’s, what is the NBA’s reasoning there?

That, or going with Myles Turner for MIP, is the kind of thing that can heat up debates. It can also serve to point out a couple of facts about data-driven decision making.

Coming up with the “right” criteria is hard and ad hoc. So maybe the criteria for MIP should come down to what Fromal used. And maybe sophomores should be excluded, except in some cases. But then what would those cases be? What about players who make a comeback after a bad year? Or nodding to a player who could use the encouragement, or a market that the league wants to grow?

Whether any of the above are legitimate criteria — or if and how they are considered by the NBA — is open to interpretation. Sometimes such overall organizational goals and drivers are clear, sometimes they are not. But let us not forget: Organization executives have a huge influence on these, regardless of whether data is used to capture and evaluate them.

Going from criteria to metrics is hard and ad hoc. Let’s suppose someone has somehow narrowed down the MIP criteria and written them in stone. What is the metric that best expresses each one? And how should the metrics be combined with each other to derive an overall score?
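One common answer is a weighted sum of normalized metrics. The sketch below is a hypothetical illustration of the problem, not any method the NBA or Fromal actually uses; the weights and the year-over-year improvement figures are made up:

```python
# Combine per-criterion improvement metrics into a single score.
# All weights and values below are illustrative assumptions.

def improvement_score(deltas, weights):
    """Weighted sum of year-over-year metric changes (deltas)."""
    return sum(weights[m] * deltas.get(m, 0.0) for m in weights)

weights = {"points": 0.4, "rebounds": 0.2, "assists": 0.2, "dws": 0.2}
candidate_a = {"points": 6.1, "rebounds": 2.0, "assists": 1.3, "dws": 1.5}
candidate_b = {"points": 5.0, "rebounds": 2.5, "assists": 2.1, "dws": -0.2}

print(improvement_score(candidate_a, weights))
print(improvement_score(candidate_b, weights))
```

Notice that the ranking flips entirely depending on the weights chosen, which is exactly the “hard and ad hoc” part: the math is trivial, but the weights encode someone’s judgment.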

Even the most widely used metrics were derived by someone at some point and carry their creator’s biases and shortcomings, perceived or otherwise. In the case of basketball, probably the most widely known metric is PER (Player Efficiency Rating). Whether that is the best overall metric to capture a player’s ability and influence on the game is still debated.

There are more metrics, too, which are constantly evolving, and most of them require some degree of expertise in both the domain (basketball) and the techniques (data science) to be able to fully comprehend and evaluate.

DataOps is the culture and practice of using data and analytics to drive decision making in organizations. But it is not infallible. (Image: Qubole)

Having the right data for the job is hard and not a given. Some data used today to assess NBA players’ defensive ability, such as steals and blocks, was not recorded until the 1970s. This reflects not just the rising importance of data everywhere, but also the evolution of the domain itself.

When the importance of defense in basketball gained more recognition, that data found its place. Gradually, more and more data is being added to the NBA’s arsenal, including visual and spatio-temporal data, hustle statistics, and social media content.

The process is two-way: sometimes someone comes up with an idea to quantify something for which there is no data, and sometimes data that happens to be available gets used in unforeseen ways.

Working with five-year-olds is hard, period. Perhaps unsurprisingly, not everyone who cares about the NBA gets, or cares about, data and analytics. MIP nominees have not expressed any sentiment toward such analyses, and not many fans seem to be out there doing what Spanbauer did.

Some might say fans and players are more like five-year-olds anyway, but the truth is that if things are not simple enough for a five-year-old to get, NBA analytics will stay in the state all other analytics are in right now: something a few experts and some enthusiasts can use, some others have heard of and can maybe follow, and that for most remains mumbo-jumbo.

Like all analytics applications, NBA analytics requires finding the right data sources, processing and integrating the data, applying domain knowledge, doing the analysis, and visualizing and explaining the results.

So, should the NBA be more transparent about the criteria for its awards? And what would be the result of doing this? Could it make everything deterministic, taking the fun — and money — out of it?

Going data pro
Spanbauer is not the first non-pro to engage with NBA data analysis. There is an array of NBA analytics enthusiasts, and a number of people who work professionally in the area. And the borders between the two are not always clear, as the Seth Partnow story shows. Partnow is an ex-blogger turned analyst who now works with the Bucks. John Hollinger, the person who introduced the PER, now works for the Grizzlies.

But what are people using NBA data and analytics for? It depends on who they are, what they are after, and what tools they have available. What you can do with bedroom analyses will only take you so far. For some things, high school math + spreadsheet/internet + casual fan knowledge + a few hours will do. For others, it’s probably more like a PhD + IBM Watson + basketball guru status + a few months.

We all know the film Moneyball, and many basketball fans are familiar with Kawhi Leonard’s progress through analytics. We also know how top teams in all sports are gradually becoming data driven, and we’ve seen IBM’s Watson touted as the tool to help NBA teams. Some of us have even heard about the fallacy of the “hot hand” fallacy.

For teams, the first priority is to analyze the game of their own players and that of their opponents, in order to improve the former and counter the latter. Scouting for new recruits is also important, and at the end of the day, it all comes down to winning more games, which also translates into more profit.

NBA teams seem to be using analytics applications across the spectrum, from understanding what has happened and why, to predicting what will happen, to making it happen: descriptive, diagnostic, predictive, and prescriptive analytics.

Data of all shapes and sizes is used everywhere, and the NBA is no exception.

For betting enthusiasts, it’s not so much about the game itself as about making the right prediction that will turn them into winners. For fans like Spanbauer, it’s mostly about getting more insight into the game. As a representative of a data-driven culture trickling down, his views are interesting:

“It’s difficult to ignore the role and affect advanced metrics and statistics have had in basketball – as well as other sports. While analytics isn’t the only way to analyze the game, I like to think of it as another lens through which to look.

I wouldn’t say that analytics is all about predictions. Or even results, to be honest. It’s about altering the traditional mindset of organizations, and the evolution of the front office. You see more money being spent on research, and more jobs opening up in the field of analytics.

At the end of the day, numbers are just numbers. A lot of attention is paid to them – sometimes more than necessary. The human element of the game cannot and should not be ignored. There are still intangibles that we have not been able to measure, and perhaps will not be able to measure.

That said, we should still continue to seek answers by utilizing as much data as we can. The more numbers and information fed into any model will provide more accurate results.

I don’t necessarily believe awards should have some sort of criteria. Awards are for the fans, and part of the fun of the awards is debating amongst other fans your opinion of who should or should not win. However, with award selections dictating salaries and benefits like the designated-player exception, some consideration into criteria should certainly be given.

There is enough information out there to justify spending money and using analytics to measure many different areas. Owners and general managers are on their own to decide whether or not they want to trust themselves, or the numbers.”

CIIE-IIMA Incubatee FORMCEPT Raises Funding From GVFL

admin No Comments

Data analytics startup, FORMCEPT Technologies and Solutions Pvt. Ltd has raised an undisclosed amount of Series A funding from VC fund GVFL (Gujarat Venture Finance Ltd).

The company is an incubatee of the Centre for Innovation Incubation and Entrepreneurship (CIIE) – IIM Ahmedabad. GVFL has invested in FORMCEPT through its recent GVFL Startup Fund. With this move, Sankalp Bajpai, Vice President, Investments, GVFL Limited, will join the company’s board.

Founded in 2011 by Suresh Srinivasan and Anuj Kumar, Bengaluru-based FORMCEPT is a unified data analysis platform. It helps enterprises get actionable insights from their data faster. MECBOT, the company’s flagship product, is an Open Cognitive Platform.

It transforms the way data analytics, deep learning, artificial intelligence (AI), predictive analytics, and IoT interface with each other on a near real-time basis. This leads to superior efficiencies, smart monitoring, and intelligent resource management. The startup claims to derive actionable insights from data in half the time and at one-third the cost.

Talking about the development, Suresh Srinivasan, co-founder of FORMCEPT said, “AI empowered by big data is disrupting the entire data analysis space. MECBOT, with its core capabilities and being built using Deep learning and AI techniques to handle all forms of data, is at the right inflection point on the global growth trajectory.”

The startup will use the newly raised funds to accelerate global growth. The company claims that it has a handful of Fortune 1,000 clients. It is planning to expand its technology team in India and cater to international markets.

Sanjay Randhar, Managing Director, GVFL Limited said, “As the explosive growth of data generation continues across industries, enterprises will immensely benefit from having a unified data analytics platform that can perform real-time analytics on both structured and unstructured data. We believe FORMCEPT’s product will deliver this value proposition to enterprises and help drive faster data analytics to gain better ROI.”

FORMCEPT recently worked with ESPN Cricinfo to develop an end-to-end, comprehensive, query-enabled platform that provides access to searchable cricket big data and statistics in a user-friendly manner.

In April 2017, customer data analytics software company Flytxt raised about $11 Mn funding from DAH Beteiligungs GmbH, a company related to the Hopp family office. Earlier this week, Quantta Analytics raised an undisclosed amount of funding in its Pre-Series A round. The round was led by undisclosed entrepreneurs and investors from India and Silicon Valley. Other startups in the data analytics segment along with FORMCEPT include IntelligenceNODE, Indus Insights, Axtria, BRIDGEi2i, DataCultr, etc.


11 Best Web Analytics Tools

admin No Comments

When considering the different web analytics tools your business requires, the plethora of available options can be overwhelming, especially for businesses that may not understand how to use them. That’s where hiring someone to really dig into all of the reports can be vital.

The rule often referenced in this regard is the 90/10 rule: if you have $100 to spend on analytics, spend $10 on reports and data and $90 on paying someone to filter through all of that information. Without a proper understanding of the information these services provide, it remains just raw data.

“Investing in people and the tools that those people need to be successful is key,” notes Bryan Eisenberg, author and marketing consultant. “But it’s the people who can understand that data that really matter.”

You obviously won’t use all of these tools all of the time, but it’s beneficial to know about some of the top options and how they fit into your overall web strategy. And using multiple tools only gives you further levels of insight into your customers and your success rate.

According to Avinash Kaushik, author of Web Analytics 2.0 and Web Analytics: An Hour A Day, “the quest for a single tool/source to answer all your questions will ensure that your business will end up in a ditch, and additionally ensure that your career (from the Analyst to the web CMO) will be short-lived.” So in short, it’s of extreme importance to focus on multiplicity.

For larger businesses, the more robust analytics tools can be great to really dig in, but for small and mid-sized companies, there are many free or relatively cheap offerings to help you understand this information.

We interviewed Eisenberg, Christopher Penn of Blue Sky Factory, Caleb Whitmore of Analytics Pros, June Dershewitz of Semphonic, Eric Peterson of Web Analytics Demystified, Linda Bustos of Elastic Path Software, Jamie Steven, Rand Fishkin and Joanna Lord, Trevor Peters of Critical Mass, and Justin Levy of New Marketing Labs. These experts know the tools inside and out, and this guide contains their recommendations on the best services for you to use.
11 Best Web Analytics Tools: What is Web Analytics?

But before digging into the tools themselves, let’s start with exactly what web analytics is. As Kaushik states in his book of the same title, Web Analytics 2.0 is defined as:

1. The analysis of qualitative and quantitative data from your website and the competition

2. To drive a continual improvement of the online experience of your customers and prospects

3. Which translates into your desired outcomes (online and offline)

Web Analytics 2.0 is a three-tiered data delivery and analysis approach for small and big businesses. The first tier is the data itself: the traffic, page views, clicks and more, measured for both your website and your direct competition. The second is what you do with that data, or how you apply the information gathered via these services to your customers, whether new or existing, to make their experience better and more meaningful. And the final tier is how it all circles back to meet your overarching business objectives, not just online but offline as well. Data by itself is a great way to see how you are performing, but without applying what you’ve learned, it has little use.

Dig Deeper: Three Ways to Get a Web State of Mind

11 Best Web Analytics Tools: Clickstream Analysis Tools

Google Analytics – Free

A completely free service that generates detailed statistics about visitors to your website, Google Analytics is the simplest and most robust web analytics offering. Currently used by over 50% of the top 10,000 websites in the world, according to the site’s usage statistics, you can find out where your visitors are coming from, what they’re doing while on your site and how often they come back, among many other things. As you get more involved in the site’s analytics, you can receive more detailed reports, but it’s that ease of use that makes it one of the most popular services.

“There’s really only one tool small businesses need and that’s Google Analytics,” notes Penn. “It’s so incredibly robust in terms of what it offers, and if someone tells you that Google Analytics isn’t enough for a small business, then frankly they have no idea how to use it properly.”

Google Analytics was the unanimous favorite of all the web analytics experts we talked to.
-Recommended by all experts

Yahoo Web Analytics – Free

Once you’ve mastered Google Analytics, Yahoo’s similar offering gives you a little more depth in your surveying. It offers better access control options and a simpler approach to multi-site analytics, raw and real time data collection (unlike Google, you can import cost of goods data), visitor behavior and demographics reports and customized options as well. Yahoo Analytics is a bit of a step up from Google in terms of profiling, filtering and customization, so for those looking to dig a little deeper, it’s a great option.
-Recommended by Whitmore, Bustos, Eisenberg

Crazy Egg – $9-$99/month

In short, Crazy Egg allows you to build heat maps and track your visitors’ every click based on where they are clicking within your website, which is a long way of saying that you’re exploring your website’s usability. It lets you see which parts of your site users find most interesting and click on the most, and it can help you improve your website’s design and, in essence, its conversion rate. Setup is quite simple as well, and the 30-day money-back guarantee on all accounts is a nice touch.
-Recommended by Whitmore and Dershewitz

Dig Deeper: How to Use Google to Improve Your SEO

11 Best Web Analytics Tools: Competitive Intelligence Tools

Compete – Prices vary

Perhaps best known for publishing the approximate number of global visitors to the web’s top one million websites, Compete is a great complementary tool to clickstream analytics offerings. Compete gives you competitive intelligence on what your competitors are doing and how your users ended up on your website in the first place (what their clicks were both before and after). There is a free offering that includes traffic volume data. But where Compete differs is in its search analytics, a paid service that lets you track which keywords are sending users both to your website and to your competitors’.

“The deeper digital insights you have, the better understanding you have of your customer,” says Aaron Smolick, senior director of marketing at Compete. “By using Compete products, you will have all of the information that you need to make educated decisions to optimize your online campaign, increase market share and dominate the competition.”
-Recommended by Dershewitz, Eisenberg and Levy

Dig Deeper: How to Keep Tabs on the Competition

11 Best Web Analytics Tools: Experimentation and Testing Tools

Google Website Optimizer – Free

Another free tool from the folks at Google, their Website Optimizer is a testing service that allows you to rotate different segments of content on your website to see which sections and placements convert into the most clicks, and at the end of the day, the most sales. You can choose which parts of your page you want to test, from the headline to images to text, and run experiments to see what users respond to best. And of course, with GWO being free (you don’t even need Google Analytics to use it), it could be the only A/B (running multiple versions of a page at once) and multivariate (MVT) testing solution you need.

“While not web analytics proper, Google’s Web Site Optimizer is the perfect companion to measurement and allows small business owners to test simple (A/B) and complex (multivariate) variations of their site, content, and landing pages using powerful statistical methodologies,” says Peterson. “While set-up is somewhat involved, the user interface is delightfully easy to learn and, of course, the service is available at the best of all prices — free.”

Google Website Optimizer was another unanimous favorite from our panel of web analytics experts.
-Recommended by all

Optimizely – $19-$399/month

A relatively new service (launched in June 2010), Optimizely is simple to use but its results can be quite powerful. In essence, it’s an easy way to measure and improve your website through A/B testing. As a business, you can create experiments with the site’s very easy-to-use visual interface. The beautiful thing about this service is that you need absolutely zero coding or programming background, as the tools are easy for anyone to use.
-Recommended by Whitmore and Eisenberg
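Under the hood, the “powerful statistical methodologies” these A/B testing tools apply come down to comparing conversion rates between variants. Here is a minimal sketch of a two-proportion z-test; the visitor and conversion counts are made-up numbers for illustration, not output from any of these products:

```python
import math

# Two-proportion z-test: is variant B's conversion rate significantly
# different from variant A's? Counts below are illustrative assumptions.
def ab_z_score(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = ab_z_score(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 5% level
```

This is why sample size matters in any A/B test: with too few visitors, even a real difference between variants will not clear the significance threshold.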

Dig Deeper: How to Use Google Apps to Improve Your Business

11 Best Web Analytics Tools: Voice of Customer Tools

Kissinsights from Kiss Metrics – Free to $29/month

One of the easiest tools you can implement (it literally takes a one-time JavaScript install), Kissinsights provides businesses with an easily implemented, customizable feedback form for website visitors. On the business’s end, you can manage all of the questions you’re asking customers through a single, simple dashboard. The best part of Kissinsights is that customer feedback comes in as very short, simple comments.
-Recommended by Whitmore, Eisenberg, Levy, Steven, Fishkin and Lord

4Q by iPerceptions – Free

A 100% free online survey solution that helps you understand the “why” behind your website’s visitors, 4Q’s premise is basically to learn what people are doing while on your website. Surveys are a powerful way to glean important insight from your customers’ actual experiences on your site, and 4Q offers short and simple surveys that answer the four key questions you want every visitor to answer:

• What are my visitors at my website to do?

• Are they completing what they set out to do?

• If not, why not?

• How satisfied are my visitors?

-Recommended by Whitmore, Dershewitz, Bustos and Eisenberg

ClickTale – Free to $990 (3 months free on paid plans)

A qualitative customer analytics tool, ClickTale records every action of your website’s visitors from their first click to their last. It uses meta statistics to create visual heat maps and reports on customer behavior, as well as providing traditional conversion analytics.

“One of the things that Google Analytics doesn’t do particularly well is tell you what visitors are paying attention to on a page and highlight where those visitors are getting stuck during their visit,” says Peterson. “ClickTale is essentially a video recorder for website visits and provides great detail about mouse movement, scrolling, and dozens of other critical visitor behaviors.”
-Recommended by Whitmore, Peterson, Eisenberg, Steven, Fishkin and Lord

Nasscom identifies key job roles in big data analytics

admin No Comments

The National Association of Software and Services Companies (Nasscom), as part of its reskilling initiative for the IT industry, has identified key job roles in the big data analytics domain.
IANS | June 24, 2017, 08:53 IST

Hyderabad: The National Association of Software and Services Companies (Nasscom), as part of its reskilling initiative for the IT industry, has identified key job roles in the big data analytics domain.

At the fifth annual Big Data & Analytics Summit, which concluded here on Friday, the industry body identified six areas of specialisation in the big data analytics domain.

Business analysts, solution architects, data integrators, data architects, data analysts and data scientists are expected to be key to the sector’s growth in times to come, Nasscom said in a statement.

“With the rising requirement for niche competencies in AI and analytics, the skill/expertise of the IT workforce will spearhead the analytical transformation on critical business processes across the industry. Nasscom’s Reskilling initiative will partner with the industry to identify the best curriculum against these job roles,” it said.

As per Nasscom’s Strategic Review 2017, the analytics export market grew by nearly 20 percent in FY2017. The emergence of the big data phenomenon and corresponding technologies is giving rise to new trends in the analytics domain.

The three dominant factors driving analytics, big data, and business intelligence investments are: making enterprises more customer-centric; sharpening focus on key initiatives such as entering new markets and creating new business models; and improving operational performance.

The two-day event — themed ‘AI and Deep Learning: Transforming Enterprise Decision Making’ — brought together some of the finest minds in the Indian big data and analytics industry.

They discussed the paradigm shift that artificial intelligence, machine learning and analytics are driving within India Inc’s corporate strategy.

Dialogues centred on themes ranging from autonomous driving to cognitive computing, with an emphasis on the versatility of AI applications across banking, healthcare, governance and infrastructure sectors to name a few.

Among the most prominent faces at the summit was 14-year-old Tanmay Bakshi, the world’s youngest IBM Watson coder, who delivered a keynote as well as conducted a Young Coders workshop.

India has job vacancies for 50,000 data analytics professionals: Study

admin No Comments

According to a study by online analytics training institute Edvancer, close to 50,000 analytics-related positions are currently available to be filled in India. This is expected to rise to 80,000-100,000 in 2018.
By M Saraswathy


Jobs in data analytics are up for grabs. According to a study by online analytics training institute Edvancer, close to 50,000 job vacancies related to analytics are currently available in India. This is expected to rise to 80,000-100,000 in 2018.

The study, Analytics & Data Science India Jobs Study 2017, was carried out over a period of six months by Analytics India Magazine, in association with Edvancer Eduventures.

Aatash Shah, Founder & CEO, Edvancer Eduventures, said that last year there was demand for 20,000 to 25,000 analytics professionals, which has now risen to 50,000 this year.

“There is a huge shortage of skilled talent in the analytics space. Demand is outstripping the current supply,” he added.


According to the study, the median salary offered by advertised analytics jobs in India is Rs 10.5 lakh per annum. It further said that 28 percent of all analytics jobs offer a salary in the range of Rs 6 lakh-Rs 10 lakh, and about 24 percent offer a salary of Rs 3 lakh-Rs 6 lakh.

Shah explained that an absolute fresher trained in analytics can get a package of Rs 6 lakh per annum, and that this increases much faster with years of experience.

In terms of sectors, the banking and financial sector continues to be the biggest influencer in the analytics job market. About 46 percent of all analytics jobs posted were from the banking sector; a year ago, this stood at 42 percent.


Among companies, the ten organisations with the most analytics job openings this year are Amazon, Citi, HCL, Goldman Sachs, IBM, JPMorgan Chase, Accenture, KPMG, EY and Capgemini, according to the study.

“Automation, which is the buzzword currently, involves machine-learning and this is a part of data science. This is replacing a lot of traditional jobs,” said Shah. He added that companies want people who can implement it practically and banking is one of the largest users of analytics.


When it comes to other sectors, e-commerce has dipped in terms of analytics jobs this year: about 10 percent of analytics jobs were in e-commerce, as opposed to 14 percent a year ago. The media/entertainment sector, on the other hand, now contributes 7 percent of all analytics jobs, up from 4 percent a year ago.

Shah said that they are seeing growth of almost 200 percent every year, and that there has also been an increase in the number of students being trained every month.


“Compared to worldwide estimates, India contributes just 12 percent of open job openings currently. The number of jobs in India is likely to increase much faster than in the rest of the world as more analytics projects get outsourced to India due to the lack of skills across the world,” the study said.

Source: Analytics & Data Science India Jobs Study 2017
