11 Best Web Analytics Tools

When considering which web analytics tools your business requires, the plethora of available options can be overwhelming, particularly for businesses that don’t yet understand how to use them. And that’s where hiring someone to really dig into all of the reports can be vital.

The rule often referenced in this regard is the 90/10 rule: if you have $100 to spend on analytics, spend $10 on reports and data, and $90 on paying someone to filter through all of that information. Without a proper understanding of the information these services provide, it remains just raw data.

“Investing in people and the tools that those people need to be successful is key,” notes Bryan Eisenberg, author and marketing consultant. “But it’s the people who can understand that data that really matter.”

You obviously won’t use all of these tools all of the time, but it’s beneficial to know about some of the top options and how they fit into your overall web strategy. And using multiple tools only gives you further levels of insight into your customers and your success rate.

According to Avinash Kaushik, author of Web Analytics 2.0 and Web Analytics: An Hour A Day, “the quest for a single tool/source to answer all your questions will ensure that your business will end up in a ditch, and additionally ensure that your career (from the Analyst to the web CMO) will be short-lived.” So in short, it’s of extreme importance to focus on multiplicity.

For larger businesses, the more robust analytics tools can be great to really dig in, but for small and mid-sized companies, there are many free or relatively cheap offerings to help you understand this information.

We interviewed Eisenberg, Christopher Penn of Blue Sky Factory, Caleb Whitmore of Analytics Pros, June Dershewitz of Semphonic, Eric Peterson of Web Analytics Demystified, Linda Bustos of Elastic Path Software, Jamie Steven, Rand Fishkin and Joanna Lord of SEOMoz.org, Trevor Peters of Critical Mass and Justin Levy of New Marketing Labs. These experts know the tools inside and out and this guide contains their recommendations on the best services for you to use.
11 Best Web Analytics Tools: What is Web Analytics?

But before digging into the tools themselves, let’s start with exactly what web analytics is. As Kaushik states in his book of the same title, Web Analytics 2.0 is defined as:

1. The analysis of qualitative and quantitative data from your website and the competition

2. To drive a continual improvement of the online experience of your customers and prospects

3. Which translates into your desired outcomes (online and offline)

Web analytics 2.0 is a three-tiered data delivery and analysis service for small and big businesses. The first is the data itself, as it measures the traffic, page views, clicks and more for both your website and for your direct competition. The second is what you do with that data, or how you are able to take the information gathered via these services and apply it to your customers, whether new or existing, to make their experience meaningful and better. And the final tier is how it all circles back together to meet your overarching business objectives, not just online but offline as well. Data by itself is a great way to see how you are performing, but without applying what you’ve learned, it has little use.

Dig Deeper: Three Ways to Get a Web State of Mind

11 Best Web Analytics Tools: Clickstream Analysis Tools

Google Analytics (google.com/analytics) – Free

A completely free service that generates detailed statistics about visitors to your website, Google Analytics is the simplest and most robust web analytics offering. Currently used by over 50% of the top 10,000 websites in the world, according to the site’s usage statistics, you can find out where your visitors are coming from, what they’re doing while on your site and how often they come back, among many other things. As you get more involved in the site’s analytics, you can receive more detailed reports, but it’s that ease of use that makes it one of the most popular services.

“There’s really only one tool small businesses need and that’s Google Analytics,” notes Penn. “It’s so incredibly robust in terms of what it offers and if someone tells you that Google Analytics isn’t enough for a small business, then frankly they have no idea how to use it properly.”

Google Analytics was the unanimous favorite of all the web analytics experts we talked to.
-Recommended by all experts
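To make the kind of question Google Analytics answers concrete (for example, how often visitors come back), here is a minimal Python sketch over a hypothetical visit log. The field names are invented for illustration; GA derives this for you from data collected by its page tag.

```python
from collections import Counter

def return_visit_counts(visits):
    """Count visits per visitor ID and bucket visitors into
    new (1 visit) vs. returning (2+ visits)."""
    per_visitor = Counter(v["visitor_id"] for v in visits)
    new = sum(1 for n in per_visitor.values() if n == 1)
    returning = sum(1 for n in per_visitor.values() if n > 1)
    return {"new": new, "returning": returning}

# Hypothetical log entries; a real tool collects these automatically.
log = [
    {"visitor_id": "a", "page": "/"},
    {"visitor_id": "a", "page": "/pricing"},
    {"visitor_id": "b", "page": "/"},
]
print(return_visit_counts(log))  # {'new': 1, 'returning': 1}
```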

Yahoo Web Analytics (web.analytics.yahoo.com) – Free

Once you’ve mastered Google Analytics, Yahoo’s similar offering gives you a little more depth in your surveying. It offers better access control options and a simpler approach to multi-site analytics, raw and real time data collection (unlike Google, you can import cost of goods data), visitor behavior and demographics reports and customized options as well. Yahoo Analytics is a bit of a step up from Google in terms of profiling, filtering and customization, so for those looking to dig a little deeper, it’s a great option.
-Recommended by Whitmore, Bustos, Eisenberg

Crazy Egg (crazyegg.com) – $9-$99/month

In short, Crazy Egg allows you to build heat maps and track your visitors’ every click based on where they are specifically clicking within your website, which is a long way of saying that you’re exploring your website’s usability. It lets you see which parts of your site users find most interesting and click on the most, and it can help you improve your website design and, in essence, conversion. Setup is quite simple as well, and the 30-day money-back guarantee on all accounts is a nice touch.
-Recommended by Whitmore and Dershewitz
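The idea behind a click heat map is simple to sketch: bucket click coordinates into a grid of cells and count the clicks in each cell. A toy version in Python (the real product also handles collection and rendering):

```python
def click_heatmap(clicks, cell=100):
    """Bucket (x, y) click coordinates into cell-by-cell pixel
    squares and count clicks per square; hot squares get big counts."""
    grid = {}
    for x, y in clicks:
        key = (x // cell, y // cell)
        grid[key] = grid.get(key, 0) + 1
    return grid

clicks = [(120, 40), (130, 55), (560, 300)]
print(click_heatmap(clicks))  # {(1, 0): 2, (5, 3): 1}
```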

Dig Deeper: How to Use Google to Improve Your SEO

11 Best Web Analytics Tools: Competitive Intelligence Tools

Compete (compete.com) – Prices vary

Perhaps best known for publishing the approximate number of global visitors to the web’s top one million websites, Compete is a great complementary tool to clickstream analytics offerings. Compete gives you competitive intelligence on what your competitors are doing and how your users ended up on your website in the first place (what their clicks were both before and after). There is a free offering that includes traffic volume data. But where Compete differs is in its search analytics, a paid service that lets you track which keywords send users both to your website and to your competitors’.

“The deeper digital insights you have, the better understanding you have of your customer,” says Aaron Smolick, senior director of marketing at Compete. “By using Compete products, you will have all of the information that you need to make educated decisions to optimize your online campaign, increase market share and dominate the competition.”
-Recommended by Dershewitz, Eisenberg and Levy

Dig Deeper: How to Keep Tabs on the Competition

11 Best Web Analytics Tools: Experimentation and Testing Tools

Google Website Optimizer (google.com/websiteoptimizer) – Free

Another free tool from the folks at Google, Website Optimizer is a testing service that lets you rotate different segments of content on your website to see which sections and placements convert into the most clicks and, at the end of the day, the most sales. You can choose which parts of your page you want to test, from the headline to images to text, and run experiments to see what users respond to best. And with GWO being free (you don’t even need Google Analytics to use it), it could be the only A/B (running multiple versions of a page at once) and multivariate (MVT) testing solution you need.

“While not web analytics proper, Google’s Web Site Optimizer is the perfect companion to measurement and allows small business owners to test simple (A/B) and complex (multivariate) variations of their site, content, and landing pages using powerful statistical methodologies,” says Peterson. “While set-up is somewhat involved, the user interface is delightfully easy to learn and, of course, the service is available at the best of all prices — free.”

Google Website Optimizer was another unanimous favorite from our panel of web analytics experts.
-Recommended by all
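Underneath any A/B test is the same statistical question: is the difference in conversion rates between two versions bigger than chance would explain? A minimal two-proportion z-test in Python with illustrative numbers (testing tools like GWO handle this math internally):

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: conv_* = conversions, n_* = visitors.
    Returns the z statistic and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative traffic split: 4.0% vs. 5.2% conversion
z, p = ab_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2), round(p, 4))
```

With these numbers the p-value falls below the conventional 0.05 threshold, so variant B’s lift would usually be treated as significant rather than noise.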

Optimizely (optimizely.com) – $19-$399/month

A relatively new service (launched in June 2010), Optimizely is simple to use but its results can be quite powerful. In essence, it’s an easy way to measure and improve your website through A/B testing. As a business, you can create experiments with the site’s very easy-to-use visual interface. The beautiful thing about this service is that you need absolutely zero coding or programming background, as the tools are easy for anyone to use.
-Recommended by Whitmore and Eisenberg

Dig Deeper: How to Use Google Apps to Improve Your Business

11 Best Web Analytics Tools: Voice of Customer Tools

Kissinsights from Kiss Metrics (kissinsights.com) – Free to $29/month

One of the easiest tools you can implement (it literally takes a one-time JavaScript install), Kissinsights provides businesses with an easily implemented and customized feedback form for website visitors. On the business’s end, you can manage all of the questions you’re asking customers through a single, simple dashboard. The best part of Kissinsights is that customer feedback comes in as very short, simple comments.
-Recommended by Whitmore, Eisenberg, Levy, Steven, Fishkin and Lord

4Q by iPerceptions (4qsurvey.com) – Free

A 100% free online survey solution that allows you to truly understand the “why” behind your website’s visitors, 4Q is premised on learning what people are doing while on your website. Surveys are a powerful way to glean important insight from your customers’ actual experiences on your site, and 4Q offers short, simple surveys built around the four key questions you want every customer to answer:

• What are my visitors at my website to do?

• Are they completing what they set out to do?

• If not, why not?

• How satisfied are my visitors?

-Recommended by Whitmore, Dershewitz, Bustos and Eisenberg
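The value of these four questions is that the answers tally into simple metrics. A tiny Python sketch, with invented response fields, of turning survey answers into a task-completion rate:

```python
def task_completion_rate(responses):
    """Share of respondents who completed what they set out to do
    ('completed' is the yes/no answer to question two)."""
    done = sum(1 for r in responses if r["completed"])
    return done / len(responses)

# Hypothetical survey responses (intent = question one, free-form)
answers = [
    {"intent": "buy", "completed": True},
    {"intent": "research", "completed": False},
    {"intent": "support", "completed": True},
]
print(task_completion_rate(answers))  # 2 of 3 -> 0.6666666666666666
```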

ClickTale (clicktale.com) – Free to $990 (3 months free on paid plans)

A qualitative customer analytics tool, ClickTale records every action of your website’s visitors from their first click to their last. It uses meta-statistics to create visual heat maps and reports on customer behavior, as well as providing traditional conversion analytics.

“One of the things that Google Analytics doesn’t do particularly well is tell you what visitors are paying attention to on a page and highlighting where those visitors are getting stuck during their visit,” says Peterson. “ClickTale is essentially a video recorder for web site visits and provides great detail about mouse movement, scrolling, and dozens of other critical visitor behaviors.”
-Recommended by Whitmore, Peterson, Eisenberg, Steven, Fishkin and Lord

Nasscom identifies key job roles in big data analytics

The National Association of Software and Services Companies (Nasscom), as part of its reskilling initiative for the IT industry, has identified key job roles in the big data analytics domain.
IANS | June 24, 2017, 08:53 IST

Hyderabad: The National Association of Software and Services Companies (Nasscom), as part of its reskilling initiative for the IT industry, has identified key job roles in the big data analytics domain.

At the fifth annual Big Data & Analytics Summit, which concluded here on Friday, the industry body identified six areas of specialisation in the big data analytics domain.

Business analysts, solution architects, data integrators, data architects, data analysts and data scientists are expected to be key to the sector’s growth in times to come, Nasscom said in a statement.

“With the rising requirement for niche competencies in AI and analytics, the skill/expertise of the IT workforce will spearhead the analytical transformation on critical business processes across the industry. Nasscom’s Reskilling initiative will partner with the industry to identify the best curriculum against these job roles,” it said.

As per Nasscom’s Strategic Review 2017, the analytics export market grew by nearly 20 percent in FY2017. The emergence of the big data phenomenon and corresponding technologies is giving rise to new trends in the analytics domain.

The three dominant factors driving analytics, big data and business intelligence investments are: making enterprises more customer-centric; sharpening focus on key initiatives that lead to entering new markets and creating new business models; and improving operational performance.

The two-day event — themed ‘AI and Deep Learning: Transforming Enterprise Decision Making’ — brought together some of the finest minds in the Indian big data and analytics industry.

They discussed the paradigm shift that artificial intelligence, machine learning and analytics are driving within India Inc’s corporate strategy.

Dialogues centred on themes ranging from autonomous driving to cognitive computing, with an emphasis on the versatility of AI applications across banking, healthcare, governance and infrastructure sectors to name a few.

Among the most prominent faces at the summit was 14-year-old Tanmay Bakshi, the world’s youngest IBM Watson coder, who delivered a keynote as well as conducted a Young Coders workshop.

India has job vacancies for 50,000 data analytics professionals: Study

According to a study by online analytics training institute Edvancer, close to 50,000 positions related to analytics are currently available to be filled in India. This is expected to rise to 80,000-100,000 in 2018.
By M Saraswathy


Jobs in data analytics are up for grabs. According to a study by online analytics training institute Edvancer, close to 50,000 job vacancies related to analytics are currently available in India. This is expected to rise to 80,000-100,000 in 2018.

The study, Analytics & Data Science India Jobs Study 2017, was carried out over a period of six months by Analytics India Magazine, in association with Edvancer Eduventures.

Aatash Shah, founder and CEO of Edvancer Eduventures, said that last year there was demand for 20,000 to 25,000 analytics professionals, which has now risen to 50,000 this year.

“There is a huge shortage of skilled talent in the analytics space. Demand is outstripping the current supply,” he added.


According to the study, the median salary offered by advertised analytics jobs in India is Rs 10.5 lakh per annum. It further said that 28 percent of all analytics jobs offer a salary of Rs 6 lakh-Rs 10 lakh, and about 24 percent offer a salary of Rs 3 lakh-Rs 6 lakh.

Shah explained that an absolute fresher trained in the analytics space can get a package of Rs 6 lakh per annum. He said that this increases much faster with the years of experience.

In terms of sectors, the banking and financial sector continues to be the biggest influence on the analytics job market. About 46 percent of all analytics jobs posted were from the banking sector. A year ago, this stood at 42 percent.


Among companies, the ten organisations with the most analytics job openings this year are Amazon, Citi, HCL, Goldman Sachs, IBM, JPMorgan Chase, Accenture, KPMG, EY and Capgemini, according to the study.

“Automation, which is the buzzword currently, involves machine-learning and this is a part of data science. This is replacing a lot of traditional jobs,” said Shah. He added that companies want people who can implement it practically and banking is one of the largest users of analytics.


When it comes to other sectors, e-commerce has dipped in terms of analytics jobs this year: about 10 percent of analytics jobs were in the e-commerce sector, as opposed to 14 percent a year ago. The media/entertainment sector, on the other hand, contributes 7 percent of all analytics jobs, up from 4 percent a year ago.

Shah said that Edvancer is seeing growth of almost 200 percent every year, and that there has also been an increase in the number of students being trained every month.


“Compared to worldwide estimates, India contributes just 12 percent of open job openings currently. The number of jobs in India is likely to increase much faster than in the rest of the world as more analytics projects get outsourced to India due to a lack of skills across the world,” the study said.

Source: Analytics & Data Science India Jobs Study 2017

Health IT analytics helps optimize big physician practice’s operations

BOSTON — When a growing pediatrics practice with 32 locations in New York started looking at new geographic markets for its business, it designed a heat map of potential patients using an advanced health IT analytics system.



Allied Physicians Group PLLC, based in the hamlet of Melville in Long Island, N.Y., is in the second year of a project to extend health IT analytics from Dimensional Insight Inc. (DI) across its business and management operations and, eventually, to its clinical services.

“What we want to do is have a full-breadth analytics package that we can use at our physician group, and really run our business optimally,” said Robert Creaven, executive vice president of operations at Allied, in a presentation at DI’s annual user group conference here in early June.

About 150 people representing dozens of DI customers attended the two-day conference at the Hyatt Regency Boston Harbor hotel.

The physician-owned Allied employs about 150 pediatricians and clinicians. Employees throughout the practice — from front-desk staffers to doctors — use the analytics program, which features a simple user interface. And while the business is built around 32 individual profit centers, the umbrella group handles key business functions including IT, HR, administration, accounting, taxes, purchasing and marketing.

A key driver of the ongoing move to analytics throughout the enterprise is the shift to incentive-based contracts with the group’s insurance payers, and away from fee-for-service, Creaven said.

“This is part of the reason we needed a tool like DI,” Creaven said. “We have to capture that data better and report to the insurance companies we work with, and report all the quality numbers that are available, and ensure that the quality monies that are supposed to be coming to us are accurate and timely.”

DI, a health IT analytics vendor based in Burlington, Mass., has made significant inroads in healthcare since it was founded 25 years ago. While DI is used widely in the food and beverage and supply chain management verticals, its Diver analytics program has seen broad adoption in business intelligence and clinical applications in healthcare.

While not a native health IT company, DI consistently ranks among the top in healthcare business intelligence vendor surveys, alongside leading health IT analytics vendors and major IT vendors such as Health Catalyst, McKesson, Microsoft, IBM, Oracle, Information Builders and SAP.

Meanwhile, Creaven noted that Allied is surrounded by big hospital systems that are constantly buying up physician practices, but it wants to retain its independence, and a powerful health IT analytics tool can help it do that.

“The way we grow is by having provider-partner-owners who want to have a say in their business and its direction join us and not be controlled by outside forces,” he said.

A couple of challenges the practice faced when it was shopping for an analytics system were keeping costs down and obtaining software it could configure and adapt as the practice grew and added new specialties.

After it installed Diver and validated data from a legacy business intelligence system, Allied adopted a data governance program that could accommodate individual practices, but also benefit the entire group.

As configured for the practice, the program has several major tabs:

an executive summary view from a revenue-expenditure perspective, either by individual practice or the group overall;
production, from office visits to hospital rounding and patient populations;
operations, which is a function used by office managers;
medical coding compliance; and
a measure dictionary that defines all the data elements in the system.
Another Allied executive, Michael Marturiello, senior vice president of finance and accounting, said the health IT analytics system provided another advantage to the far-flung group: transparency.

The group’s 65 partner-owners can look at divisional budget numbers inside the corporate general ledger at any time and, thus, have more information to plan expenditures such as bonuses or vaccine purchases. This capability reduces the complexity of business decision-making for the doctors, Marturiello said.

GOP data firm that exposed millions of Americans’ personal information is facing its first class-action lawsuit

A data-analytics firm hired by the Republican National Committee last year to gather political information about US voters accidentally leaked the sensitive personal details of roughly 198 million citizens earlier this month. And it’s now facing its first class-action lawsuit.
Deep Root Analytics, a data firm contracted by the RNC, stored details of about 61% of the US population on an Amazon cloud server without password protection for roughly two weeks before it was discovered by security researcher Chris Vickery on June 12.

The class-action lawsuit, filed by James and Linda McAleer of Florida and all others similarly situated, alleges Deep Root failed to “secure and safeguard the public’s personally identifiable information such as names, addresses, email addresses, telephone numbers, dates of birth, reddit.com browsing history, and voter ID number, which Deep Root collected from many sources, including the Republican National Committee.”

The data exposed by Deep Root included 1.1 terabytes “of entirely unsecured personal information” compiled by Deep Root and at least two other Republican contractors, TargetPoint Consulting, Inc., and Data Trust, according to an analysis from the cybersecurity firm UpGuard.

“In total, the personal information of potentially near all of America’s 200 million registered voters was exposed, including names, dates of birth, home addresses, phone numbers, and voter registration details, as well as data described as ‘modeled’ voter ethnicities and religions,” UpGuard said.

The lawsuit says that President Donald Trump “is on record denouncing these sorts of breaches as ‘gross negligence.'”

It says that “as a direct and proximate cause of Deep Root’s conduct,” those exposed in the data breach may be vulnerable to identity theft and “a loss of privacy,” and argues that the “actual damages” exceed $5 million.

The exposed information did not include highly sensitive information like Social Security numbers, and much of it was publicly available voter-registration data provided by state government officials, a company spokesman told Business Insider on Tuesday.

“Since this event has come to our attention, we have updated the access settings and put protocols in place to prevent further access,” Deep Root said in a statement. “We take full responsibility for this situation.”

Deep Root didn’t immediately respond to a request for comment Wednesday.

But the exposed database combined people’s personal information and political inclinations – including proprietary information gathered via predictive modeling tools – to create a detailed profile of nearly 200 million Americans that would be a “gold mine” for anyone looking to target and manipulate voters, said Archie Agarwal, the founder of the cybersecurity firm ThreatModeler.

“This is the mother lode of all leaks,” Agarwal said Monday. “Governments are made or broken on this. I don’t even have the words to describe it.”

Joe Loomis, the founder and chief technology officer at the cybersecurity firm CyberSponse, predicted that a series of lawsuits against Deep Root over the accidental leak would prove damaging.

“Even if it was human error and not intentional, one IT person is probably going to put this company out of business,” he said.

Verimatrix Underscores Commitment to Verspective Operator Analytics through Device-Level Data Collection Acquisition

SAN DIEGO, June 23, 2017 /PRNewswire/ — Verimatrix, the specialist in securing and enhancing revenue for multi-network, multi-screen digital TV services around the globe, today announced the acquisition of the MiriMON technology and development team from Genius Digital, the expert in audience analytics for TV. This state-of-the-art client device data collection technology has already been integrated within the Verimatrix Verspective™ Operator Analytics solution suite and ViewRight® Ultra downloadable app packages. As a part of Verspective, the technology provides real-time service quality and customer experience visibility at the device level, as well as feeds the Verspective platform that provides powerful competitive insight for next-generation video services. This acquisition enables a closer alignment with Verspective and its security foundations and full control over the roadmaps for the core data collection technology and service dashboard displays.

“This is a natural addition to our Verspective Operator Analytics solution, which offers subscriber intelligence tools that emphasize data security and integrity as a foundation for actionable intelligence,” said Tom Munro, CEO of Verimatrix. “We see the real-time client data as a key piece to help our customers gain a holistic picture of how their services are performing and how subscribers are interacting with those services. We are excited to advance our Verspective solution and reinforce our relationship with Genius Digital.”

The award-winning Verspective Operator Analytics has a broad, “silo-busting” approach to data collection for today’s video operators, offering connections to video-on-demand (VOD) servers, content distribution networks (CDNs), and security and content management solutions. But an especially important data source in such systems is the instrumentation of client devices themselves, which provide a detailed view of live and on-demand consumption, as well as subscriber/device interactions. The Verspective client data collection technology, at the heart of this acquisition, enables a secure source of return path data for both linear and adaptive bitrate (ABR) services to a scalable data collection resource that can either be cloud-based, virtualized or server-hosted. Combining all this data yields viewer insights which help operators reduce subscriber churn and create new revenue streams.

Along with the data collection technology, Verimatrix will add a new core analytics engineering team that will be based in a development center in Bristol, U.K. Verimatrix will continue its long-standing partnership with Genius Digital with a focus on expanding the applications of Genius Digital’s Insight and Analytics services, enabling both data-driven subscriber analytics and enhanced advertising services.

“We are convinced that data security will be key to enabling the data-driven pay-TV operator and we are delighted that our data collection technology will continue to be integrated as part of the Verspective Operator Analytics suite,” said Tom Weiss, CEO of Genius Digital. “We are looking forward to continuing our partnership with Verimatrix to provide actions from data to TV service providers worldwide.”

For more information on Verspective Operator Analytics, visit www.verimatrix.com/analytics.

About Verimatrix
Verimatrix specializes in securing and enhancing revenue for multi-network, multi-screen digital TV services around the globe and is recognized as the global number one in revenue security for connected video devices. The award-winning and independently audited Verimatrix Video Content Authority System (VCAS™) family of solutions enables next-generation video service providers to cost-effectively extend their networks and enable new business models. The company has continued its technical innovation by offering the world’s only globally interconnected revenue security platform, Verspective™ Intelligence Center, for automated system optimization and data collection/analytics.

Its unmatched partner ecosystem and close relationship with major studios, broadcasters and standards organizations enables Verimatrix to provide a unique advantage to video business issues beyond content security as operators introduce new services to leverage the proliferation of connected devices. Verimatrix is an ISO 9001:2008 certified company. For more information, please visit www.verimatrix.com, our Pay TV Views blog and follow us @verimatrixinc, Facebook and LinkedIn to join the conversation.

How Analytics Has Changed in the Last 10 Years (and How It’s Stayed the Same)

Ten years ago, Jeanne Harris and I published the book Competing on Analytics, and we’ve just finished updating it for publication in September. One major reason for the update is that analytical technology has changed dramatically over the last decade; the sections we wrote on those topics have become woefully out of date. So revising our book offered us a chance to take stock of 10 years of change in analytics.

Of course, not everything is different. Some technologies from a decade ago are still in broad use, and I’ll describe them here too. There has been even more stability in analytical leadership, change management, and culture, and in many cases those remain the toughest problems to address. But we’re here to talk about technology. Here’s a brief summary of what’s changed in the past decade.

The last decade, of course, was the era of big data. New data sources such as online clickstreams required a variety of new hardware offerings on premises and in the cloud, primarily involving distributed computing — spreading analytical calculations across multiple commodity servers — or specialized data appliances. Such machines often analyze data “in memory,” which can dramatically accelerate times-to-answer. Cloud-based analytics made it possible for organizations to acquire massive amounts of computing power for short periods at low cost. Even small businesses could get in on the act, and big companies began using these tools not just for big data but also for traditional small, structured data.


Along with the hardware advances, the need to store and process big data in new ways led to a whole constellation of open source software, such as Hadoop and scripting languages. Hadoop is used to store and do basic processing on big data, and it’s typically more than an order of magnitude cheaper than a data warehouse for similar volumes of data. Today many organizations are employing Hadoop-based data lakes to store different types of data in their original formats until they need to be structured and analyzed.
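Hadoop’s processing model, MapReduce, can be sketched in a few lines of plain Python: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group. This toy word count runs on one machine; Hadoop’s contribution is running the same pattern across a cluster of commodity servers.

```python
from collections import defaultdict

def map_step(line):
    # Emit a (word, 1) pair for every word in the line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Group emitted values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_step(groups):
    # Aggregate each key's values; here, a simple sum.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big clusters", "big data"]
pairs = [p for line in lines for p in map_step(line)]
counts = reduce_step(shuffle(pairs))
print(counts)  # {'big': 3, 'data': 2, 'clusters': 1}
```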

Since much of big data is relatively unstructured, data scientists created ways to make it structured and ready for statistical analysis, with new (and old) scripting languages like Pig, Hive, and Python. More-specialized open source tools, such as Spark for streaming data and R for statistics, have also gained substantial popularity. The process of acquiring and using open source software is a major change in itself for established businesses.
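As a minimal illustration of that structuring step, here is a Python sketch that turns raw clickstream lines into analysis-ready records. The pipe-delimited line format and field names are assumptions for the example, not anything from the article:

```python
from datetime import datetime

def parse_clickstream_line(line):
    """Structure one raw clickstream line for statistical analysis.
    The pipe-delimited format (timestamp|user_id|url) is hypothetical."""
    ts, user, url = line.strip().split("|")
    return {
        "timestamp": datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S"),
        "user_id": user,
        "url": url,
    }

raw_lines = [
    "2017-06-01T12:00:00|u42|/products/1",
    "2017-06-01T12:00:05|u42|/cart",
]
records = [parse_clickstream_line(line) for line in raw_lines]
```

At scale the same transformation would typically be expressed in Pig, Hive, or a distributed Python framework, but the logic is the same: impose fields and types on semi-structured text.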

The technologies I’ve mentioned for analytics thus far are primarily separate from other types of systems, but many organizations today want and need to integrate analytics with their production applications. They might draw from CRM systems to evaluate the lifetime value of a customer, for example, or optimize pricing based on supply chain data about available inventory. In order to integrate with these systems, a component-based or “microservices” approach to analytical technology can be very helpful. This involves small bits of code or an API call being embedded into a system to deliver a small, contained analytical result; open source software has abetted this trend.
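A hedged sketch of such a contained analytical result: a single function that a CRM system might invoke through one API call to score customer lifetime value. The geometric-lifetime formula and its parameters are illustrative, not a standard model:

```python
def lifetime_value(order_totals, churn_rate=0.2, margin=0.3):
    """Return one contained analytical result, as a microservice endpoint might.
    Simple geometric-lifetime model: average order value times margin, over an
    expected lifetime of 1/churn_rate periods (all figures are illustrative)."""
    avg_order = sum(order_totals) / len(order_totals)
    expected_lifetime = 1.0 / churn_rate
    return round(avg_order * margin * expected_lifetime, 2)
```

The point is architectural rather than statistical: the analytic is a small, self-contained component that any production system can call, instead of a monolithic analytics suite.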

This embedded approach is now used to facilitate “analytics at the edge” or “streaming analytics.” Small analytical programs running on a local microprocessor, for example, might be able to analyze data coming from drill bit sensors in an oil well drill and tell the bit whether to speed up or slow down. With internet of things data becoming popular in many industries, analyzing data near the source will become increasingly important, particularly in remote geographies where telecommunications constraints might limit centralization of data.
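The drill-bit example can be sketched as a tiny streaming analytic that keeps only a short rolling window in memory and decides locally, without shipping raw sensor data to a central system. The window size and vibration threshold below are invented for illustration:

```python
from collections import deque

class DrillBitMonitor:
    """Tiny streaming analytic small enough to run on a local microprocessor:
    only a rolling window of readings is retained, and the speed decision is
    made at the edge rather than centrally."""

    def __init__(self, window=4, vibration_limit=0.8):
        self.readings = deque(maxlen=window)  # old readings fall off automatically
        self.vibration_limit = vibration_limit

    def ingest(self, vibration):
        self.readings.append(vibration)
        rolling_avg = sum(self.readings) / len(self.readings)
        return "slow_down" if rolling_avg > self.vibration_limit else "maintain"
```

Because memory use is bounded by the window length, the same pattern scales down to the constrained hardware typical of remote IoT deployments.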

Another key change in the analytics technology landscape involves autonomous analytics — a form of artificial intelligence or cognitive technology. Analytics in the past were created for human decision makers, who considered the output and made the final decision. But machine learning technologies can take the next step and actually make the decision or adopt the recommended action. Most cognitive technologies are statistics-based at their core, and they can dramatically improve the productivity and effectiveness of data analysis.
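As a toy example of a model taking the action itself rather than advising a human, here is a minimal nearest-centroid classifier in plain Python; the approve/deny labels and training data are hypothetical:

```python
def train_centroids(samples):
    """Fit a nearest-centroid 'model' from (features, label) pairs."""
    sums, counts = {}, {}
    for features, label in samples:
        counts[label] = counts.get(label, 0) + 1
        prev = sums.get(label, [0.0] * len(features))
        sums[label] = [a + b for a, b in zip(prev, features)]
    return {label: [v / counts[label] for v in s] for label, s in sums.items()}

def decide(centroids, features):
    """Take the action directly: return the label of the nearest centroid."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical historical decisions used as training data
model = train_centroids([
    ([0, 0], "deny"), ([0, 1], "deny"),
    ([5, 5], "approve"), ([6, 5], "approve"),
])
```

Statistics at the core, as the text notes; what makes it “autonomous” is only that the returned label is executed directly instead of being reviewed by a person.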

Of course, as is often the case with information technology, the previous analytical technologies haven’t gone away — after all, mainframes are still humming away in many companies. Firms still use statistics packages, spreadsheets, data warehouses and marts, visual analytics, and business intelligence tools. Most large organizations are beginning to explore open source software, but they still use substantial numbers of proprietary analytics tools as well.

It’s often the case, for example, that it’s easier to acquire specialized analytics solutions — say, for anti-money laundering analysis in a bank — than to build your own with open source. In data storage there are similar open/proprietary combinations. Structured data in rows and columns requiring security and access controls can remain in data warehouses, while unstructured/prestructured data resides in a data lake. Of course, the open source software is free, but the people who can work with open source tools may be more expensive than those who are capable with proprietary technologies.

The change in analytics technologies has been rapid and broad. There’s no doubt that the current array of analytical technologies is more powerful and less expensive than the previous generation. It enables companies to store and analyze both far more data and many different types of it. Analyses and recommendations come much faster, approaching real time in many cases. In short, all analytical boats have risen.

However, these new tools are also more complex and in many cases require higher levels of expertise to work with. As analytics has grown in importance over the last decade, the commitments that organizations must make to excel with it have also grown. Because so many companies have realized that analytics are critical to their business success, new technologies haven’t necessarily made it easier to become — and remain — an analytical competitor. Using state-of-the-art analytical technologies is a prerequisite for success, but their widespread availability puts an increasing premium on nontechnical factors like analytical leadership, culture, and strategy.
Thomas H. Davenport is the President’s Distinguished Professor in Management and Information Technology at Babson College, a research fellow at the MIT Initiative on the Digital Economy, and a senior adviser at Deloitte Analytics. Author of over a dozen management books, his latest is Only Humans Need Apply: Winners and Losers in the Age of Smart Machines.

Enterprises seek operational insight from network analytics

Shamus McGillicuddy, an analyst with Enterprise Management Associates Inc. in Boulder, Colo., said he sees opportunities for companies to achieve operational insight with network analytics.
Network data is emerging as a critical asset to boost enterprise performance. In retail, healthcare and finance, almost every business-critical function passes over the wire. According to McGillicuddy, IT teams are beginning to leverage packet flows not only to determine IT performance, but also to gain operational insight.

EMA research indicated that nearly half of network managers apply advanced analytics tools to network data, primarily for network security and optimization. But 27% use these tools for business process optimization, determining factors such as underuse of point-of-sale devices or the impact of Wi-Fi in a retail environment.

According to McGillicuddy, network service providers will be the first to focus on operational insight through network analytics, followed soon after by enterprises. The Linux Foundation is already working on the PNDA.IO network analytics project for service providers. “But network managers will need more than analytical insight. They need tools that help them present that insight to the business. Reports and dashboards that are designed for consumption by nontechnical personnel will be critical for impactful network analytics,” McGillicuddy wrote in a blog post.

Read more of McGillicuddy’s thoughts on gaining operational insight.

Anticipating threats in the wake of WannaCry

Jonathan Care, an analyst with Gartner, said he views Linux as the next great vulnerability now that the WannaCry ransomware attack has stalled. Care said professionals need to be aware of the many embedded systems that rely on Linux, such as point-of-sale terminals, TVs, cameras, routers and medical equipment.

In the wake of the WannaCry-EternalBlue exploit, a new Unix-Linux vulnerability, dubbed EternalRed, is coming to light. The vulnerability has existed since 2010 and is only patched in recent distributions.

According to Care, WannaCry was an important reminder about basic hygiene factors in IT security, such as vulnerability management and patch installation. “Industries across the board are vulnerable. I’ve talked above about IoT [internet of things], web applications, cloud, and as we’ve seen in EternalBlue there are many vulnerable systems out there that have been forgotten,” Care wrote in a blog post.

“We have to take action and control, no matter what industry we’re in,” he added. Advanced attacks will be the new normal and therefore companies must take basic infrastructure safety steps, focus on a secure development lifecycle and ensure vigilance on the part of fraud managers and incident responders.

Dig deeper into Care’s thoughts about Linux vulnerabilities.

Slow adoption for 25 GbE switches

Carlos Cardenas, writing in Packet Pushers, looked into the reasons for slow adoption of 25 Gigabit Ethernet (GbE) switches. According to Cardenas, the new 28 GHz standard — the basis for both 25 GbE and 100 GbE — permits backward compatibility to 12 GHz standards.

However, to reach faster speeds, auto-negotiation and forward error correction (FEC) were also needed, which proved to be a big challenge for vendors that wanted to create interoperable 25 GbE switches. Broadcom, for example, wasn’t able to support FEC in every single lane of its 25 GbE chips, although it was able to in its 100 GbE devices.

By focusing on chip debugging, Mellanox was able to deploy its ASIC to interoperate with both standards. Cardenas said he expects Barefoot Tofino and Cavium XPliant to be able to interoperate because they were introduced after Mellanox, when the new standards had been ratified. More vendors are now moving toward interoperability with Tomahawk and potentially also with ASIC vendor Cavium. Cardenas projected that 25 GbE top-of-rack switching will soon begin to take off.

Google won’t add featured snippets analytics in Search Console

Google is not actively working on a tool within Google Search Console for publishers to get analytics about their featured snippets.

That’s according to Gary Illyes, a Google webmaster trends analyst, who made the comments at SMX Advanced this morning during his keynote conversation with Danny Sullivan.

About a year ago, we caught Google testing featured snippets within Google Search Analytics. But those tests have evidently ended, because Illyes said they are not actively working on this feature.

He did say they have a solution to show publishers this data. “We have something that we think would answer those questions, but we cannot release it,” Illyes said.

They cannot release it because the people who make the decisions at Google, Illyes’ bosses, won’t allow them to, he said. “Some people don’t want to see this feature launched for various [reasons].”

What he suggested is that webmasters communicate to Google what they would do with that data, so he can bring that sales pitch back to his bosses. They need to find a use case where it would help publishers make better content.

Illyes did add that Google is working on a solution to show publishers when their web pages are used by Google for voice search. But there are no such plans for featured snippets.

Illyes later added that featured snippets data is not dead, but there is a “political” reason why they are not launching it yet. We, SEOs and Gary, need to convince Google to launch it.

Powering up assets and reinventing utility operations with analytics

Asset-intensive and field-force-driven utilities are undergoing a major shift with the advent of smart meters, sensors, intelligent devices and IoT-based systems. However, these enterprises face challenges in managing the enormous volumes of data generated by assets, multiple MDMs, ERP/CRM and SCADA systems. Leveraged efficiently, this data can provide a better understanding of consumers, assets, and demand and supply operations, yet that is nearly impossible with the traditional data management systems in use across utilities.

What are the problems at hand?

The main issues include:

No real-time visibility to track and monitor consumption data, which would help utilities and end consumers stay aware of waste and its associated costs.
Absence of a scalable data management system for utilities and meter providers, making it a challenge to store and manage the enormous volume of records collected from millions of meters.
No system to send alerts or notifications reporting leakage, theft, overconsumption and utility loss as they happen, so utility providers and consumers are unable to act on the problems.
Lack of proactive maintenance and remote asset management systems to track the condition and working capability of assets.
No predictive intelligence to anticipate events, failures and outcomes and thereby enhance operational efficiency.
Therefore, utilities face a pressing need to reinvent their operations and realize the power of their data systems. Consider a scenario in which a utility learns about a failure of assets or operations well in advance: it could take corrective measures or plan pre-maintenance activities before the event occurs. Utilities can realize this business value by adopting predictive analytics capabilities, which help them optimize asset performance, reduce downtime and enhance customer satisfaction.
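The know-about-failure-in-advance scenario can be sketched as a simple trend extrapolation: fit a least-squares line to recent sensor readings and estimate when the asset will cross a failure threshold. The temperature readings and limit below are illustrative; a production system would use far richer models:

```python
def readings_until_failure(temps, limit=90.0):
    """Fit a least-squares line through recent readings and return how many
    more readings until the trend crosses the failure limit, or None if the
    trend is flat or falling. Readings and limit are illustrative."""
    n = len(temps)
    mean_x = (n - 1) / 2.0
    mean_y = sum(temps) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(temps))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    if slope <= 0:
        return None  # no upward trend, no predicted failure
    intercept = mean_y - slope * mean_x
    # Index where the fitted line crosses the limit, relative to the last reading
    return (limit - intercept) / slope - (n - 1)
```

Even this crude estimate is enough to schedule pre-maintenance before the predicted crossing rather than reacting after the outage.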

The analytics solution

Research suggests that 32% of traditional utilities are revamping their existing data management systems with advanced data analytics solutions that transform enormous amounts of data into intelligent insights and actions. These solutions can monitor AMI/AMR systems, provide real-time insight into utility distribution and consumption, optimize consumption and discover anomalies. Utilities can also promote consumer awareness and encourage energy conservation by gleaning insights from consumers’ monthly utility usage patterns.

While serving its water utility customers, a smart meter manufacturer and utility service provider encountered several data systems-related challenges. However, it addressed all these challenges by implementing a scalable and robust smart metering data analytics solution which:

Manages millions of smart meters, which send billions of records every year
Provides real-time visibility of utility consumption, meter health, leakage/theft, hourly consumption data, etc.
Has a notification engine to send consumers text-message alerts, alerting them on abnormally high usage or leaks
Generates real-time insights from meter data to prevent over-consumption, leakage or wastage of utility
Tracks utility consumption minute by minute, with visibility on a dashboard for the operations team
Several Azure components such as IoT Hub, Event Hub, SQL DB, HDInsight, machine learning and stream analytics have been leveraged to architect this solution. While this particular solution addresses specific water utility business problems, similar analytics solutions can be developed to address the problems of the entire energy and utilities market.
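One capability listed above, alerting consumers to abnormally high usage or leaks, can be sketched as a simple z-score check against a meter’s own history. The threshold and readings are illustrative, not drawn from the deployed solution:

```python
import statistics

def leak_alerts(hourly_usage, history, z_limit=3.0):
    """Flag hours whose usage deviates sharply above this meter's own history;
    the returned indices could feed a text-message notification engine."""
    mu = statistics.mean(history)
    sigma = statistics.pstdev(history) or 1e-9  # guard against constant history
    return [i for i, u in enumerate(hourly_usage) if (u - mu) / sigma > z_limit]
```

In a streaming deployment the same check would run per meter inside the ingestion pipeline, with the baseline statistics refreshed periodically rather than recomputed on every reading.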

All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.
