Archives January 2022

BigQuery Connector for SAP: Power your cloud data analytics strategy

Google Cloud has a genuine passion for solving technology problems that make a difference for our customers. With the release of our BigQuery Connector for SAP, we’re taking another big step towards solving a major challenge for SAP customers with a quick, easy, and inexpensive way to integrate SAP data with BigQuery, our serverless, highly scalable, and cost-effective multicloud data warehouse designed for business agility.

Solving for simplified data integration

Like most businesses today, SAP customers are eager to unlock the immediate insights and opportunities within their ever-growing stores of business data. However, many are discovering just how hard it can be to take the first step in any modern, cloud-enabled data analytics strategy: combining SAP data with other cloud-native and enterprise data sets in real time and at scale. According to a 2020 SAPInsider study, more than half of SAP customers surveyed said data integration was their top analytics pain point. These companies urgently need a rapid, sustainable, cost-effective, and scalable way to integrate SAP data with modern cloud data analytics solutions.

The BigQuery Connector for SAP gives our customers a solution: a fast, simple, cost-effective and massively scalable way to make SAP data fully accessible within BigQuery by leveraging customers’ existing SAP Landscape Transformation Replication Server (SLT) tooling and skill sets. It’s the first SAP SLT direct near real-time connector for BigQuery that requires no additional infrastructure or third-party middleware, and it can be deployed using a variety of embedded or stand-alone deployment options. In fact, most customers can install the BigQuery Connector for SAP in less than an hour—a remarkably easy way to start working with our industry-leading analytics solution that delivers proven and quantifiable business advantages for customers. Additionally, the BigQuery Connector for SAP is not restricted to customers who have deployed their SAP applications on Google Cloud. Customers who are running their SAP applications on-premises, or on any other cloud, can also deploy the solution and realize its analytical benefits.

Designing a solution with customer requirements and investments in mind

When the Google Cloud team started work on an analytics data integration tool for our SAP customers, we began with a set of requirements designed to root out the usual sources of cost and complexity. These included: 

The need for real-time performance with deltas replicated in milliseconds

The ability to integrate data from almost any SAP NetWeaver-based application running today, regardless of its deployment location (on premises, any cloud, or Google Cloud)

Automatic BigQuery data type mapping with minimal transformation required

Generation of target tables in BigQuery directly from source, if required 

Application layer integration that avoids the issues of direct database access

Leveraging customers’ existing SAP skillsets, change data capture, and infrastructure

An important step towards meeting these requirements came when Alphabet, Google’s parent company, decided to leverage SAP SLT as a foundation for developing direct data replication between SAP and BigQuery for its internal corporate landscape. SLT, as part of SAP’s strategic Business Technology Platform, supports real-time replication of data from SAP or third-party systems to SAP HANA; however, one of its limitations was direct integration with targets like BigQuery.

SAP SLT was a logical foundation for developing the connector for several reasons: 

It’s widely adopted among SAP customers who likely already leverage SLT for SAP analytics data integration

It works with almost every non-SaaS SAP application environment running today

It supports real-time replication performance at massive scale

It was an obvious choice for the Alphabet engineering team, who saw immediate value in integrating SAP with BigQuery.

“The BigQuery Connector for SAP has enabled fast, low latency data replication for billions of records from 500+ tables of our most critical financial and supply chain data. Now in one cost-effective BigQuery data lake, this ERP data can be combined with other data sources for previously impossible real-time analytics and ML use cases. This allows us to drive much deeper strategic insights that support business and operational excellence, management and P&L reporting and more.”—Anil Nagalla, Sr. Engineering Director, Financial Systems, Google

SAP data integration with BigQuery enables new value

By leveraging SAP SLT, the BigQuery Connector for SAP can integrate real-time data streams from any SAP system—while also taking advantage of customers’ existing SAP investments and skillsets. 

At the same time, the BigQuery Connector for SAP does a lot of heavy lifting on its own. For example, it automatically handles the complex, multi-step process of transforming SAP data types for use in BigQuery—mapping data-type transitions between the SAP and BigQuery environments, creating a target table schema on BigQuery for the transformed data types, building the target BigQuery table, and even adapting as new data types appear in your SAP environment.
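To make the idea of automatic type mapping concrete, here is a minimal Python sketch of the kind of SAP-to-BigQuery type map involved. The type pairs and field names are illustrative assumptions, not the connector’s actual mapping table:

```python
# Illustrative sketch only: the connector performs this mapping automatically,
# and the type pairs below are examples, not its authoritative mapping table.

SAP_TO_BIGQUERY_TYPES = {
    "CHAR": "STRING",   # fixed-length character
    "DATS": "DATE",     # SAP date, stored as YYYYMMDD
    "TIMS": "TIME",     # SAP time, stored as HHMMSS
    "DEC":  "NUMERIC",  # packed decimal
    "INT4": "INT64",    # 4-byte integer
}

def build_bigquery_schema(sap_fields):
    """Derive a BigQuery schema definition from SAP field -> type pairs."""
    return [
        {"name": name, "type": SAP_TO_BIGQUERY_TYPES.get(sap_type, "STRING")}
        for name, sap_type in sap_fields.items()
    ]

# Hypothetical fields from a material master table
print(build_bigquery_schema({"MATNR": "CHAR", "ERDAT": "DATS", "MENGE": "DEC"}))
```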

For teams that want to fine-tune the BigQuery Connector for SAP’s automated recommendations, the connector supports additional levels of customization and choice. But if you simply want to get the job done and give your data analytics team greater support for their high-value work, then you’ll love just how quickly and easily the BigQuery Connector for SAP turns the complicated work of integrating and processing large volumes of data into a done deal. By integrating enterprise data sets in real time, customers can deliver differentiated value and unlock new insights and actions that drive a competitive advantage.

The BigQuery Connector for SAP really shines as an enabling tool that transports and transforms your SAP data to power analytics solutions enabled by accelerators like the Google Cloud Cortex Framework: a comprehensive set of reference architectures, deployment accelerators, and integration services designed to give SAP customers a fast and seamless path to value with their data analytics investments. Simply put, the more SAP data you make available within Google Cloud, the easier it is to get meaningful—and often game-changing—insights from these solutions.

Learn more about the BigQuery Connector for SAP

Ready to get started with your own SAP data analytics strategy on Google Cloud? Install the Google Cloud BigQuery Connector for SAP, and discover a faster, simpler, more sustainable way to power your company’s data analytics strategy.

Related article: Design considerations for SAP data modeling in BigQuery. Learn how to model your SAP data inside of BigQuery, Google Cloud’s serverless data warehouse.

Source: Data Analytics

5 Product Strategy Consulting Tips for Data-Driven Marketing Campaigns

Big data is extremely important in the marketing profession. This is supported by the fact that companies around the world will be spending over $4.6 billion on marketing analytics by 2026.

A growing number of companies are using data analytics to better understand the mindset of their customers, provide better customer service, forecast industry trends and identify the ROI of various marketing strategies. They are also using AI tools like Canva and Hootsuite to automate many of their marketing tasks.

There is no disputing the fact that big data is invaluable to marketing. However, utilizing data analytics successfully can be a challenge. You need a well-articulated strategy for your marketing campaign to be successful. This is especially true if you are launching a marketing campaign for a new product.

Using big data to market your product is more important than ever. Having a breakthrough idea is no longer enough to succeed in the market. Plenty of people have unbelievably great ideas, but bringing them to market successfully is another matter. Whether you’re a small startup or an experienced company, you need a product strategy to make sure your idea is widely accepted. And you won’t be able to sustain a competitive edge without a grounding in big data.

You need an experienced data-driven marketing strategist who knows everything about product strategy and will help you get from point A to point B with great success. To do that, you need to hire a product strategy consulting expert. In this article, we’re talking more about product strategy and what the consultant will help you with.

Coming Up with a Data-Driven Marketing Strategy for a New Product

First, let’s define what a product marketing strategy is and what the consultant will be helping you with. Then we will need to consider ways to incorporate big data into it.

Every product has its value, but its success depends on how dedicated you are to delivering it to the market. The consultant will share their experience and offer a few tricks from up their sleeve to get the job done better. They will also give you insights into how you can use big data to improve your strategy. Now let’s look at what you need to keep in mind when creating the right product strategy.

1. Make sure you have a clear goal.

Big data can be very valuable for product marketing. However, investing in new technology is not going to be valuable without the right strategy in place.

Everything in business is a marathon, and nothing good comes overnight. That’s why you need to set goals for your company and the particular product. Some of the most successful companies out there have set goals for years in the future and work on them daily.

Every product strategy must be wrapped around your business goal. Both must be headed in the same direction. If your main goal is something that your clients recognize as genuinely valuable, your products will be successful.

Once you have outlined your strategy, you can start brainstorming ways to use data analytics technology to make the most of it.

2. Set a clear product mission with predictive analytics

The product’s mission and the company’s vision are deeply connected and intertwined. Every product must have a mission. The company must set an objective for what their product will achieve and how valuable it will be for the consumers.

A product’s mission can take different forms: for example, being reliable, generating income, or making a considerable impact. The mission is what you intend to do with your product, whether that is changing the world or profoundly affecting your clients.

You will need to be realistic about the goals you establish for your product. This is going to be a lot easier if you use predictive analytics technology to better understand the trajectory of the market. You can use Google Trends and other predictive analytics tools in your market research to estimate future demand and gauge evolving customer expectations. You will have more realistic standards and know how to position your product to achieve them.
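As a rough sketch of how this can work in practice, the snippet below uses pytrends, an unofficial Python client for Google Trends. The keyword and the "recent versus baseline" heuristic are illustrative assumptions:

```python
from pytrends.request import TrendReq  # pip install pytrends (unofficial client)

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["electric bikes"], timeframe="today 5-y")  # hypothetical product
trend = pytrends.interest_over_time()

# Naive demand gauge: compare the last quarter's interest to the 5-year average
recent = trend["electric bikes"].tail(13).mean()
baseline = trend["electric bikes"].mean()
print("Demand trending up" if recent > baseline else "Demand flat or declining")
```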

3. Research the market

Before starting the campaign, it is essential to understand how the market works, what’s happening among the competition, and what you can expect from your product. You can’t launch an LED bulb project in a village without electricity or sell pork in a Muslim country.

Ignoring these realities means you’re destined to fail, and if you try the pork example, you might even face severe punishment. This is a clear example of why researching the market landscape is vital before starting anything.

A number of data mining tools make it easier to understand the market. You can do your due diligence by using data analytics tools like Qlik and Resonate. You will avoid going into marketing blind.

4. Use customer data to better understand them

Aside from the competition, you need to know your customers too. You must research what they prefer, learn how you can advertise the product, figure out which advertising channel will work best, and determine what else should be done before going forward with the strategy.

You can use customer data from advertising platforms like Facebook or CRM platforms to better understand your customers. This way, you’ll learn what consumers prefer, how you can get their attention, what the best ways to prepare the campaign will be, and with it, find the best product strategy possible.

5. Use data analytics to understand every facet of your strategy before the launch

All the findings mentioned above should be recorded in a spreadsheet and analyzed correctly. You need to address every point separately and make sense of the research as a whole. You need to leverage data analytics strategically. The analysis will give you the big picture, and you’ll know the best approach to creating a product strategy.

Companies that gather and analyze information make 30% more profit than those content to launch their products without doing the work. Spend some time looking into the facts, and watch your product strategy succeed.

Big Data is Crucial for Marketing New Products

Developing a perfect data-driven product marketing strategy is complex, and if you can’t do it alone, you should hire a professional to help you. Consulting is a great way to get the job done correctly. If you’re making a significant investment in a product launch, you should ensure that it is flawless. Every little detail in this process is valuable, so make sure you’re using all the pieces of the puzzle correctly. With a consultant’s help, you will know which data analytics tools to use and how to implement them properly.


Source: SmartData Collective

Worst Data Security Threats Remote Workers Can’t Ignore

Data security is becoming a greater concern for companies all over the world. The pandemic has contributed to these issues. A number of hackers started targeting companies for data breaches during the pandemic, partly because so many employees were working remotely.

The frequency of data breaches is not likely to subside anytime soon. Many companies are making work-from-home models permanent, which means data threats are going to be as common as ever.

When you are working from home, you need to take stringent data security precautions. If you are a business owner, you need to also take the right data-driven cybersecurity measures.

Importance of Data Security While Working from Home

There was a time when working from home was a perk. Currently, however, it has become a necessity, as individuals around the globe isolate themselves from the coronavirus. A recent survey shows that more than 51% of US-based employees were forced to work from home during the COVID-19 outbreak. In these exceptional circumstances, your routine job may expose you and your colleagues to new types of cyber risks designed to exploit remote work.

The biggest cybersecurity challenge for IT teams during the pandemic was ensuring remote workers had adequate access to company systems. But committing to a much higher level of data security is an excellent opportunity to improve business resilience.

That is why it is more essential than ever that you begin to consider your remote workplace security. Remote working presents various data security challenges that both workers and businesses ought to know about. The good news is that by following prescribed procedures for working remotely, the most significant of these threats can be mitigated.

Even a few decades ago, it would have been almost impossible for many businesses to operate remotely. Without the proper technology, an employee had to go to the office to do their job. One downside of today’s technological advances, though, is that the boundary between work and home life erodes, so maintaining a healthy work-life balance is crucial for many workers.

The ability to balance these two lives is the key to making us feel happier and more productive at work. By saving the time spent on long commutes, employees can achieve a better work-life balance and add hours to their days, while interruptions from co-workers are reduced, ultimately increasing remote employees’ productivity. This has been one of the biggest benefits of big data in recent years.

Remote work presents an extraordinary test for data security, since remote workplaces don’t have the same protections as the office. When an employee is at the office, they are working behind layers of preventive security controls. While not 100% secure, it is harder to commit a security error at the workplace. When a company’s systems and devices leave that perimeter and people work remotely, new dangers emerge for the organization, and additional security strategies become essential.

Remote employees are ordinarily the first to confront security threats. They’re the critical point from which a malicious actor tries to penetrate. Whether or not you have remote workers in your organization, portable devices like cell phones and laptops present security hazards that should be addressed before hackers exploit them.

Best Remote Security Practices for Businesses

Here are the best practices that can be followed to counter remote data security risks and minimize the threat to the network.

Make sure your passwords are strong and secure

One of the easiest but most effective ways to protect your online presence (whether you are working from home or not) is to use strong passwords and to ensure that multi-factor authentication is enabled across your devices and applications. You can drastically reduce the risk of a data breach by taking these simple steps.
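For example, a strong random password can be generated with nothing more than Python’s standard library. A minimal sketch:

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # use a unique password per account
```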

Keep an Inventory of Organization Devices

It is critical to identify the assets that could bring threats to an organization. When devices are not accounted for, it is challenging to defend the network from the threats and risks those devices are exposed to. Having an updated record of devices and their users helps in deploying security measures and makes monitoring them for possible data breaches easier.

Phishing Emails

Phishing emails are hackers’ weapon of choice. They target and attract users with lucrative offers and scams to harvest sensitive information like PII (personally identifiable information), credit card details, or even login credentials by redirecting the user to a look-alike version of a legitimate web application.

Such techniques are common and advanced enough to be tough to detect in many cases. Therefore, an email security gateway should be deployed to minimize this risk rather than leaving responsibility solely with employees. You also need to take the right steps to safeguard email and stop data security threats.

Using Unsecured Personal Devices and Networks

With multiple variants of operating systems and device hardware in play, it is complex to keep every device and network updated and secure for remote work. Every unsecured device brings a new challenge to the business.

Also, the medium through which these devices communicate is less secure than it would be in the office environment, so it should be well secured too, to prevent data loss or new threats to devices.

Weak Backup and Recovery System

The most important step in preparing for a cyber incident is to have a backup and recovery process in place beforehand. Many businesses neglect backup and recovery practices, yet they are the most effective way to resume operations after a cyber incident. A weak backup and recovery plan, or none at all, could be disastrous, leading to catastrophic data loss.

Video-Conferencing Attacks

The popular video-conferencing application Zoom has been found vulnerable to a zero-day exploit, and many other applications could expose the businesses using them to similar threats. It is recommended to keep such applications patched with the most recent updates to avoid compromise.

Countermeasures for Brute-Force VPN Attacks

Remote working has exposed organizations to a large volume of login attacks, including brute-force VPN attacks. Additionally, organizations often disable the built-in account lockout function on VPN connectivity to preserve business continuity or ease the overhead on IT, which makes the VPN a viable entry point for attackers.

Attackers target a VPN portal and bombard it with authentication attempts using a list of pre-compiled credentials. If any of the username/password combinations work, the attacker gains a foothold. Worse, if the target uses single sign-on (SSO), the attacker also holds a valid domain login. The attacker can then quickly infiltrate the network, begin discovery using that domain login, and try to escalate privileges.
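The standard countermeasure is the account lockout mentioned above: after a few failed attempts within a time window, further attempts are rejected. A minimal sketch of that logic, with hypothetical policy values for the threshold and window:

```python
import time
from collections import defaultdict

MAX_ATTEMPTS = 5            # hypothetical policy threshold
WINDOW_SECONDS = 15 * 60    # hypothetical lockout window

failures = defaultdict(list)  # username -> timestamps of recent failed logins

def record_failure(username):
    failures[username].append(time.time())

def is_locked_out(username):
    """Reject logins once a user accumulates too many recent failures."""
    now = time.time()
    failures[username] = [t for t in failures[username] if now - t < WINDOW_SECONDS]
    return len(failures[username]) >= MAX_ATTEMPTS
```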

Best Remote Security Practices for Employers

Migrating Applications to the Cloud

The main reason to migrate web applications to the cloud is to stay current with recent threat defenses while maintaining compliance with industry regulations. In addition, a managed security layer is added to the application, and backup is no longer a hurdle in the cloud.

Require a VPN for Employee Access

A virtual private network (VPN) is the most recommended channel for remote users, as it extends the same security defined and created for the organization by establishing a private tunnel between remote employees and the company. It is as if the user were inside the company: all policies are pushed down to the remote user without a glitch.

However, attackers have been intimidating users during the pandemic to convince them to click on malicious links and download malicious programs. When a user clicks on one of these links, the attacker’s payload is downloaded and connects back to a command and control (C2) server. The attacker will then begin to explore, escalate privileges, and find and steal sensitive data.

Policies Related to BYOD Should Be Enforced

If the organization follows a BYOD (bring your own device) model, it should enforce policies related to BYOD and mobile device management to prevent any potential loss of information.

Use Multi-Factor Authentication

Because the user resides outside the organization, there is a higher chance that the user’s account gets compromised due to reduced surveillance. Multi-factor authentication helps authenticate users more securely, ensuring that no compromised account can reach critical information or the company’s network.

Use Services to Manage Passwords

One of the biggest causes of data breaches is compromised passwords. Managing passwords is a tough challenge for many organizations, as it requires a solid mechanism for protecting and storing them so that no one can recover them. Various password management services can offload this overhead from the IT team.
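On the storage side, the usual practice is to keep only salted one-way hashes, so that even the service operator cannot recover the original passwords. A minimal sketch using Python’s standard library; the iteration count is an assumption, so tune it to current guidance:

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # assumed work factor; adjust to current guidance

def hash_password(password):
    """Return (salt, digest) to store instead of the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```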

Educate Employees on Data Security Practices

While you can trust yourself and your knowledgeable employees to be safe online, it is worth noting that in these times, corporate computers are more likely to be exposed to young children and employees’ family members.

Therefore, you should gently remind employees to keep their devices secure and not allow other members of their families to access their laptops, cell phones, and other types of hardware at work.

No matter how many security controls you have in place to detect and defend against a potential threat, if the end user is not aware of it, your company is at high risk. As employees are the first line of defense against any attack, it is critical to conduct awareness training for employees periodically.

Guard Against Insider Threats

Greater exposure of corporate information on remote devices brings a strong potential for insider threats: information can be misused by an employee for financial gain, to harm the business’s reputation, or to settle a personal grudge.

Catching insider threats is very difficult if an employee uses a personal device to access sensitive data, because the device lacks corporate security controls such as DLP (data loss prevention), which typically catches an insider exfiltrating data.

Best Remote Security Practices for Employees

An end user is an employee who uses the hardware and software assets of your organization to perform their job duties. This includes people at all levels, with different backgrounds and degrees of knowledge.

Making the secure handling of information a common practice in their routine tasks is the central challenge in strengthening the organization’s first line of defense. The following are some remote security best practices that end users should follow:

Be Cautious of Phishing Emails

First of all, don’t open any link or URL from unknown senders. Hackers try to use clean email addresses with relevant subjects to trick users into clicking malicious links or redirecting them to fake input pages. Also, read the informational updates from your organization’s IT department to stay aware of recent cyber threats.

Secure Conference Calls and Online Video Sessions

While using online video conferencing tools like Zoom, it is highly recommended to use a strong, unique password to log in; this minimizes the risk that employee data will be exposed if the application is compromised.

Also, consider using a paid subscription for such access, as you get enhanced Zoom security features in the paid edition. Other measures include locking the meeting once it has started, using the waiting room option to let users in manually, and restricting screen sharing to the host.

Execute Software Updates Promptly

If you work with an effective IT team, they can install regular updates, run virus scans, block malicious websites, and so on, and these activities can be transparent to you. Chances are that your personal PC does not follow the same protocols required at work, and your company can likely provide a higher level of technical inspection than you would apply personally.

Without these protections running in the background, your PC is not secure enough for workplace information, because a third party could compromise it. In essence, using an unmanaged personal computer, whether on a work network or remotely, endangers corporate networks and exposes you to potential liability for serious corporate damage through violations of policies, practices, or both. Employees must keep all device software updated in the remote work environment, as unpatched software could lead to compromise. Timely software updates bring enhanced security features and improved software stability.

Use Strong Passwords

Employees should create passwords using a combination of upper- and lower-case letters, special characters, and numbers, with a minimum length of eight characters. Such passwords are hard to guess and complex to break using brute-force techniques.
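A minimal Python sketch of a checker for exactly this policy:

```python
import re

def meets_policy(password):
    """Check the policy above: upper, lower, digit, special, length >= 8."""
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

print(meets_policy("Sh0rt!"))       # False: only six characters
print(meets_policy("S3cure!Pass"))  # True: satisfies every rule
```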

Securing Your Belongings

Employees often leave belongings like laptops, mobile phones, and bags unattended, allowing anyone to extract critical information from them. An employee should always lock their devices when not in use.

Be Cautious with Wireless Connections

Public Wi-Fi poses a significant security risk and should be avoided where possible. If you need to access the Internet from a public Wi-Fi hotspot, there are two basic problems to solve. First, other people have access to the same network, and without a firewall between you and them, threat actors anywhere in the room can probe your computer.

Second, interested observers on that network, or on any other network your data crosses between you and your workplace, can monitor your traffic. It is essential to find ways to protect your computer and encrypt traffic. Open wireless connections are easy doorways for malicious actors to penetrate and sniff the data communicated over them; such poorly secured Wi-Fi networks make it easier for hackers to carry out cyberattacks.

Segregating Office Work from Personal Work

A company laptop is for an employee’s business usage. Employees’ personal tasks should be done separately from a company system or company cell phone, since applications like social media and entertainment apps put company information at risk if they get compromised. Employees should be advised to follow this practice.

Using Personal Devices

Managing a personal device and an office device remotely is a headache, and to avoid it, many employees use the same company laptop for personal use. This leads to unproductive and insecure web browsing. The more a corporate device is used for personal work, the more it is exposed to potential threats.

When you say “my work environment is safe,” you mean that you have taken reasonable steps to protect and ensure the integrity of the data, code, or other confidential information in your care. You have also taken steps to ensure that neither you nor an unauthorized person exercises your access rights to sensitive information systems in a manner prejudicial to the organization that holds the information or the purposes of those systems.

Remote teams have a much larger attack surface than centralized teams. In a centralized team, you can physically lock confidential information behind firewalls and corporate workstations; with telecommuting, employees are advised or even asked to bring their own device, which can quickly become bring your own disaster. However, by applying the appropriate guidelines, you can certainly minimize the risk of a breach.

Security breaches happen occasionally, and while the digital world has witnessed many of them, we can learn from each one. It is also essential to provide a clear path for securing all digital devices.

The sooner your organization can detect and respond to a data breach or security incident, the less likely it is to significantly impact your data, customer trust, reputation, and revenue. If your organization does not have an incident response process, consider creating one.

Any effective data breach costs revenue, reputation, customer confidence, and, most importantly, critical intelligence. While your organization may not have the profile of a Home Depot or a Target, a breach at a small or medium-sized organization whose users work remotely can still have a significant impact on privacy.

Nevertheless, you can surely minimize the risk of potential cyberattacks on the remote side by following these easy-to-adopt strategies and best practices. Moreover, it is crucial to recognize that the future of work will be dynamic, and that security must meet the needs of a distributed workforce.

Take the Right Measures to Protect Against Data Breaches

Create a flexible, secure hybrid environment that gives employees the same level of protection when accessing critical services wherever they are. To allow this, it is necessary to address priorities and modify the policies and controls used on site.


Source: SmartData Collective

Optimizing Your IT Budget While Running a Data-Centric Company

Big data technology has become a very important aspect of our lives. More businesses than ever are transitioning to data-driven business models. Research has shown that companies with big data strategies are 19 times more likely to become profitable.

Unfortunately, some businesses have made poor decisions when instituting a data strategy. In a sense, despite its tremendous value, big data has become a bit of a bubble for many companies. They have heard that big data can be useful, so they invest in it without much consideration for their ROI.

If you are trying to create a data-centric business, you are going to need to invest wisely. You are going to need to come up with an IT budget that is geared towards making the most of your big data strategy.

In the world of IT, change is constant. This means that you have to be proactive about keeping up with trends in order to remain competitive. The good news is that there are a number of ways for CEOs and CIOs alike to do this without breaking the bank. This blog post will explore 5 ways to make the most of your IT budget while creating a data strategy in 2022.

#1 Equip Your Employees with the Right Data Analytics Tools

You can’t have a data-driven company with subpar technology at your disposal; there is no way that you will be able to sustain a competitive edge. One of the best ways to make the most of your IT budget is by equipping your employees with the right data analytics tools. This means investing in software, devices, and the best IT training courses that will help them do their jobs more effectively.

Not only will this improve productivity, but it can also lead to cost savings down the road. For example, investing in a good CRM system can help your sales team close more deals. This means more revenue for your company, which can far outweigh the initial investment.

CRM systems are so effective because they are heavily reliant on detailed customer data. You can use the analytics of your CRM system to identify customers that are most likely to convert, segment different customer groups based on your marketing strategies, automate various customer engagement practices and keep track of the buying stages different customers are in.
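As an illustration of the kind of analysis a CRM export enables, here is a hedged sketch that scores leads by conversion likelihood with scikit-learn. The column names and the tiny dataset are invented for the example:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical CRM export: engagement features plus whether each lead converted
leads = pd.DataFrame({
    "emails_opened":  [1, 7, 3, 9, 0, 6],
    "site_visits":    [2, 12, 4, 15, 1, 9],
    "demo_requested": [0, 1, 0, 1, 0, 1],
    "converted":      [0, 1, 0, 1, 0, 1],
})

features = leads.drop(columns="converted")
model = LogisticRegression().fit(features, leads["converted"])

# Rank leads by predicted probability of converting
leads["conversion_score"] = model.predict_proba(features)[:, 1]
print(leads.sort_values("conversion_score", ascending=False))
```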

The same principle applies to other types of software as well, including HR management systems and accounting tools. Allowing employees to use these programs can not only increase productivity but also shorten billing cycles, allowing you to get paid faster. And that is always something worth investing in. Big data has changed these practices as well, but you have to make sure the right technology is available.

Tip: Be sure to consult with your employees before making any major purchases. They may have suggestions on what tools they need to do their jobs better.

#2 Invest in Cloud Computing

Another great way to make the most of your IT budget is by investing in cloud computing. This technology can help you save money on everything from hardware to software licenses. In addition, it can also help you improve efficiency and collaboration among employees.

Cloud computing is more important in the age of big data. You can’t store all of the data that you need on your own internal servers, so the cloud helps make it more scalable.

One of the best things about cloud computing is that it can be tailored to any industry. So, whether you’re looking for something that can handle heavy graphics or just need a simple online database, there is likely an option out there to meet your needs.

Tip: Be sure to do some research before committing to any cloud vendor. There are dozens of providers on the market and it’s important that you find one who meets your specific needs.

#3 Upgrade Your Network Infrastructure

One of the biggest ways to make the most of your IT budget in 2022 is through network infrastructure upgrades. These are often overlooked because they can be difficult and costly, but every business should have an upgrade plan.

For example, investing in a fiber-optic backbone will not only improve performance across all devices but also help you prepare for the future. This is because this type of infrastructure can help you handle bandwidth-intensive applications like virtual reality and augmented reality, which are quickly becoming business necessities.

Tip: When it comes to network upgrades, be sure to consult with your IT team before making any moves. They may have suggestions on what kind of backbone will work best depending on your current infrastructure and future needs.

#4 Utilize Virtual Assistants

Another great way to make the most of your IT budget is by utilizing virtual assistants. These are employees who work remotely and can help you with everything from manning your IT help desk to handling basic administrative tasks at a fraction of the cost of hiring a full-time employee.

While this may seem counterintuitive because it requires an extra investment, the benefits often outweigh the costs. For example, using a virtual assistant can help you reduce the amount of time and money spent on recruiting, interviewing, and training new employees.

Virtual assistants can also help you save money on office space, furniture, and benefits. Since they work remotely, this allows you to hire from anywhere in the world, which opens up a larger talent pool.

Tip: When hiring a virtual assistant, be sure to ask for references and/or check out their online reviews. This will help you ensure that you’re getting quality service.

#5 Create a Data-Centric Culture

Finally, one of the best ways to make the most of your IT budget in 2022 is by creating a data-centric culture. This means not only equipping employees with analytics software but also encouraging them to use it whenever possible.

For example, send sales reps weekly reports on how they’re doing compared to their goals, or have marketing teams track website visits and social media engagement. Not only will this help employees make better decisions, but it will also allow you to measure the effectiveness of your marketing campaigns and other initiatives.

Tip: Make sure that all data is accessible across the organization. This will ensure that everyone has access to the information they need to make informed decisions.

Use Your Budget Wisely When Coming Up with a Big Data Strategy

Big data is very important, but it won’t do you much good if you don’t invest in it wisely. Fortunately, making the most of your IT budget in 2022 doesn’t have to be difficult; the tips above will get you there.

Upgrading your network infrastructure, utilizing virtual assistants, and creating a data-centric culture are all great ways to make the most of your budget. What are you waiting for? Start implementing these tips today!


Source: SmartData Collective

Artificial Intelligence Is Influencing Everyday Lives for the Better

Artificial intelligence is having a larger impact on our lives than you may think. Although only 38% of businesses use AI in some form, 90% of the most successful companies utilize some form of AI.

You may be wondering how significant AI really is. To some, AI may seem like any other over-hyped buzzword that has never truly manifested in the day-to-day human life. However, for those who have been following the AI conversation much more closely, it’s crystal-clear that this phenomenon is here to stay. It is very much the future of technology.

If you are one among the skeptics, keep reading to learn more about what artificial intelligence is and how it can impact your everyday life and business.

1. AI Makes Your Ride Shorter and Much More Convenient

Have you ever wondered how ride-sharing apps such as Lyft and Uber give near-perfect destination suggestions? How do they actually match you with other passengers to eliminate detours? You guessed right—it’s all courtesy of artificial intelligence. More specifically, Machine Learning.

In an interview with Hyperight, the tech leader at Uber, Ritesh Agrawal, explained how ML helps improve the overall riding experience. For example, the smart reply system One-Click chat allows drivers and riders to communicate on the go with robust in-app messaging. How so? Well, it uses NLP and ML to accurately anticipate responses to regular riders’ questions.

Details about frequently traveled destinations and ride history are also leveraged to yield personalized destination suggestions.

2. AI Checks for Plagiarism with Unmatched Precision

Plagiarism checkers have been around for quite a while, but it’s only recently that they’ve become extremely effective and dependable. Tools like Turnitin are used by instructors and students alike to accurately detect plagiarism in essays. Guess the technology behind such tools? Artificial intelligence, of course! AI-driven plagiarism detectors are also useful for people vetting websites before buying them.

While most plagiarism tools won’t reveal exactly how they detect plagiarism, research has it that ML can indeed be used to develop a plagiarism detector. For instance, ML can be leveraged to scour and compare sources that are not even on the internet, such as sources in old databases or those in foreign languages.
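Commercial tools keep their methods proprietary, but the core idea can be sketched with a simple TF-IDF similarity check in scikit-learn. The documents and the flagging threshold here are toy assumptions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sources = [
    "The quick brown fox jumps over the lazy dog.",
    "Machine learning models compare documents numerically.",
]
submission = "A quick brown fox jumped over a lazy dog."

# Vectorize all documents together so they share one vocabulary
matrix = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(sources + [submission])
scores = cosine_similarity(matrix[-1], matrix[:-1])[0]

for text, score in zip(sources, scores):
    flag = "possible match" if score > 0.5 else "ok"
    print(f"{score:.2f} ({flag}): {text[:40]}")
```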

3. AI Enables Swift and Accurate Google Search Results

When was the last time you searched on Google for a product you need or perhaps an answer to a question? Maybe 15 minutes ago — or less?

Here’s something you might not know: search engines can’t scan the entire internet without the help of AI spiders. It’s that simple. From delivering more relevant search results to enabling concise ranking of content and materials, AI really lends itself to Google Search.

Remember those annoying ads that you never seem to shake off? Yep, even those are powered by AI, and are customized to your unique buying needs based on your search history.

4. AI Simplifies Elderly Caregiving

For many old folks, everyday tasks can be a struggle. Many have to call upon family members or hire outside help to get simple chores done. Thankfully, advances in AI technology mean that elderly people who don’t want to leave their homes can finally have some much-needed reprieve.

Say hello to in-home, elder care robots.

Unlike humans, AI-enabled in-home care robots never tire or get frustrated and are reliable and consistent 24/7. And with the percentage of older people (65 or older) expected to rise by a whopping 26% by 2050, according to government data, there’s never been a better time to adopt elder care robots than now.

In 2019, TIME published an article about an Irish senior care robot called Stevie that could seamlessly engage with users both socially and physically. It’s 2022! You bet there are more robots like Stevie in existence.

AI Impacts Everyday Life

These real-life processes and examples are just a sample of what AI can do. It’s up to you to embrace this wonderful technology and see how your life can become easier and your business more profitable with its use.


Source: SmartData Collective

12 Jobs That Are Booming in the Age of Big Data

Did you know that big data consumption increased 5,000% between 2010 and 2020? This should come as no surprise. It is going to continue to change the workforce in the process.

Big data technology is changing countless aspects of our lives. A growing number of careers are predicated on the use of data analytics, AI and similar technologies.

It is important to be aware of the changes brought on by developments in big data. This will help you stay up to date with the necessary skills and project the future demand for your profession.

What Fields Are Growing the Fastest as a Result of Advances in Big Data?

Analysts predict global technologization trends for the following decade. Thus, IT, automation, and robotics will affect all areas of life. Along with the intensive development of IT and eco-design, the service sector will also grow.

The essence of medical experts’ and teachers’ work will change. Yet, these professions will remain on the market. There is a phrase of unknown origin that states that people will always study, eat, and get sick. Thus, the specialties in these spheres have been, are, and will always be in high demand.

The pandemic has already contributed to the changes. It started the transition of education and entertainment into the virtual environment. This trend will continue in the future. In addition, the job market changes constantly. Some occupations that used to be in demand five years ago appear unnecessary and low-paid today. To avoid getting into the list of career-field outcasts, take a look at predictions from Skillhub and adjust your plans accordingly.

So, here is the list of popular jobs that will remain so in the near future.

Bioengineer

This expert deals with altering the properties of living organisms using the main principles of medicine and biology. One of the best examples is biopharmacy. It involves the development of advanced drugs and dietary supplements for therapeutic purposes. Also, bioengineers need to know bioinformatics – the creation of programs collecting and analyzing data in biology.

Big data is playing a key role in the growth of the bioengineering field. A team of researchers from the University of Wisconsin discussed this in their study Big Data Application in Biomedical Research and Health Care: A Literature Review.

Genetic Engineer

The world market’s need for genetic engineering is proliferating. As a result, this sphere has a high demand for specialists in the human genome and hereditary diseases and those involved in creating new crops. The market is growing due to advances in big data for the same reasons the University of Wisconsin study attributed to the growth in bioengineering.

Food Technologist

Foodtech is one of the main current business trends. The development of new food products – artificial meat, dairy substitutes, gluten-free confectionery – is a direct consequence of the growing demand for healthy food and the increase in population. Specialists in this area earn a pretty high salary, and the demand ensures the job remains relevant for decades.

Artificial intelligence is playing a crucial role in the growth of Foodtech. More researchers are using predictive analytics and AI to anticipate the outcomes of various food engineering processes, so big data will be even more important to this field in the future.

Robotic Engineer

Robotics specialists are involved in the design, software development, and maintenance of robots. As robots enter our everyday lives, specialists dealing with them will get more work to do. There is already a division into areas: industrial robots, home, medical, gaming, and so on. It is a promising position for those skilled in mechanics, electronics, data analytics and ML.

Programmer

Experience in coding is a skill that the world’s largest companies are willing to pay for quite well. As a result, there is a real hunt for skilled programmers. Great demand for iOS and Android mobile application developers is expected over the next ten years. Many programmers specialize in data science these days, which is playing a role in the growth of programming jobs.

3D Printing Designer

Almost anything can be printed on 3D printers – from jewels and artificial organs to residential buildings and uncrewed aircraft. With their help, designers create both prototypes and ready-to-use things. A skilled professional has to know both the field for which the object is created and 3D modeling. Low-level programming skills can also be helpful for developing software for printers.

Data analytics is behind many changes in the 3D printing space. One of the biggest benefits of big data is that it can help 3D printing experts uncover flaws in various designs before taking them to the printer. They can also use data analytics to conduct better research when conceptualizing their designs.

AI Expert

Today, artificial intelligence improves user experiences with programs and websites. A professional in neural networks uses machine learning as a primary instrument. With it, AI learns to recognize objects, give meaningful answers to questions, and reach decisions that traditional computer algorithms cannot make.

AI technologies are already used in many business niches – from processing loan applications in banks and resume screening to voice assistants in GPS navigation systems. In the future, the demand for AI training specialists will only grow.

Internet-of-Things Development Engineer

Specialists in this area are engaged in software development, machine learning, and the analysis of data obtained from various devices. The Internet of Things is entering systems of all kinds, even such unexpected ones as street lighting controls. Improving control systems for smart homes, cities, roads, and uncrewed devices also belongs to these specialists’ areas of expertise.

Big Data Analyst

They deal with analyzing significant amounts of unstructured information to identify connections that are not obvious at first glance. Based on this research, specialists in other spheres develop new substances and technologies and study behavioral patterns. Consider using a professional CV writing service when applying for a job in this fast-developing field.

Logistics Expert

It’s a very interesting profession in high demand. It revolves around organizing the transportation of goods: a logistics specialist needs to work out the cheapest way to deliver products.

The surge in online sales amid the pandemic has also boosted demand for logistics professionals. As a result, the best career prospects await logistics experts at large retail chains and transport companies.

Internet Marketing Manager

People spend most of their days on the web reading news, updating Instagram feeds, or chatting. As a result, businesses have to keep up and deliver information through the relevant channels. This demands an Internet marketing expert.

A professional in general marketing can get certified in digital marketing and switch to Internet platforms. However, the one who knows how to construct a sales funnel and build online communication with a client will always be in demand.

Big data technology is more important than ever in the field of Internet marketing. This is especially true as more marketing strategies focus on highly targeted and personalized marketing, which requires detailed customer data.

Internet Advertising Expert

While online marketing deals with the overall strategy, someone still has to set up the advertising campaigns. This is what online advertising specialists do. Having learned how to work with targeted, contextual, or any other kind of advertising, one will always find work as a full-time or freelance expert.

Change Is the Only Constant

Technology is reaching into all areas of life. Current statistics suggest an increase in demand for technical specialists in the IT sphere. In addition, population growth and eco-friendly lifestyle trends raise demand for bio- and genetic engineers. Some of the changes and trends are already quite visible. Some are yet to come into view.


Provided by Skillhub


Source: SmartData Collective

How Data Analytics Can Change the Way In-House Legal Departments Do Business

Legal analytics is an evolving discipline that is changing the future of the legal profession. Law firms are expected to spend over $9 billion on legal analytics technology by 2028.

But what is legal analytics? How will it change the legal profession?

Last year, we published an article on the ways that big law and big data are intersecting. We have had time to observe some major developments of legal analytics over the last year.

One buzzing question in in-house legal departments is, “How are we doing?” This simple question is fraught with multiple meanings; it could refer to financial health, tuning contracts for the business, compliance efforts, or litigation.

Historically, the legal space has lacked the data to measure performance appropriately and report its findings. Increasingly, though, brands and businesses of all sizes expect their legal representatives to leverage and report on data the same way as the rest of the company. As a result, big law firms have made data analytics a top-of-mind priority for in-house attorneys.

What is Legal Analytics?

Legal analytics is the process of applying data to your decision-making on topics affecting law firms and attorneys, like legal strategy, matter forecasting, and resource management. When used diligently, legal analytics provides a competitive advantage by delivering outstanding transparency and insight to in-house departments, counsel, and decision-makers.

In general, data in the legal sector falls into three categories:

Individual data

Internal law firm data

Legal industry data

The difference between internal law firm data and individual data is that the latter relates to your existing and potential clients’ online activity on your site, usually sourced from places like Google Analytics, cookies, and email campaigns. Internal data is sourced from your legal department’s daily activities, like time tracking or billing rates. On the flip side, industry data streams from third-party research groups that track legal industry trends.

A few years back, it was almost impossible to obtain any valuable insights from the massive amount of legal data available.

But with the latest technological developments like natural language processing, AI, and machine learning, legal professionals now have the tools to make data-driven decisions when formulating case strategies, estimating case results, and even gaining new clients.

Computers can perform tasks at unmatchable speed, saving valuable time and money for your legal departments and legal professionals.

Legal research

Most legal professionals and attorneys are no strangers to legal research and analytics. Spotting the right statutes or precedents could give you the edge in court. This is one of the most important ways that big data is changing the legal profession.

But performing innovative, in-depth legal research that’s accurate can be a time-consuming process. Data analytics in the legal environment can help you find relevant cases in every practice area without having to sort through all of them individually.

Delivering electronic discovery

Traditionally, maintaining, collecting, reviewing, and exchanging case-related information was a long, tedious process for legal professionals. With e-discovery analytics software, exchanging electronic data between parties during litigation and investigations is a whole lot easier.

With e-discovery legal analytics tools, you can filter documents by date range instead of delving through mountains of documents, or focus only on those containing the exact keywords. This removes the need for manual work.

Predictive analytics

Foreseeing the future is no longer impossible for legal professionals and attorneys. From estimating how long a case will last to gauging the odds of winning, predictive analytics can help legal leaders surface specific trends, correlations, and irregularities to build a case, evaluate suspects, and devise litigation strategies.

Legal departments can also rely on data analytics to facilitate their hiring processes, assess potential candidates, and create the best team composition. Valuable data can also help legal professionals cover all their needs – from deciding if they need partners, external counsel, or freelancers for projects.  

Predictive analytics enable leaders to make more informed decisions.

We’re all acquainted with the stress of having to search through stacks of documents and rows of spreadsheet data to find the one number you need. Yet even Google-like search within an organization’s systems leaves professionals spending an average of 23% of their time searching for information.

But things have changed: seamless software dashboards give critical team members real-time access to the most recent data. Data visualization methods, such as graphs and charts, reveal trends and insights in an instant.
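For instance, a dashboard-style trend chart takes only a few lines with matplotlib; the monthly figures below are invented for illustration:

```python
import matplotlib.pyplot as plt

# Hypothetical monthly counts of new matters handled by the department
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
new_matters = [14, 18, 17, 25, 31, 29]

plt.plot(months, new_matters, marker="o")
plt.title("New legal matters per month")
plt.ylabel("Matters opened")
plt.tight_layout()
plt.show()
```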

Improved Insurance Claim Processing

Often, a claim that initially seems simple can end up costing way more than anticipated. That’s a major challenge for insurance agencies. Deciding how many resources to assign to the processing and investigation of a claim in order to reduce the insurer’s total cost is no easy task.

Legal data analytics software can understand claims and predict when they will become complex. A newer class of analytics that combines the power of analytics with users’ domain knowledge can help legal professionals better understand the characteristics of complex claims. According to AccidentClaims.co.uk, taking simple inputs to define a behavior present in the data will tell legal professionals:

Which features in the data are most useful for predicting which claims will be complex
Different groups of claims that are often complex
A model that allows legal representatives to predict which future claims will be complex

Increased Marketing Potential

Legal marketing analytics can also help legal professionals understand their ideal client and how their approach stands out from the competition, so they can build a more comprehensive marketing strategy that resonates with their audience.

What’s more, legal analytics can also help predict where your prospects are coming from, so you will know the best networking and social media platforms to spend your marketing effort and budget on. 

Accelerate Your Firm’s Growth Potential

Legal data insights provide a straightforward way to identify trends and patterns that pinpoint when and where business growth occurs and where concentrated effort will unlock additional potential. More intelligent data insights make it easier to scale your firm financially, reduce costs, and gain more clients through targeted business development efforts.

By investing in legal data analytics for your legal professionals, your firm can seize previously undiscovered opportunities and strengthen its growth potential.

The post How Data Analytics Can Change the Way In-House Legal Departments Do Business  appeared first on SmartData Collective.

Source: SmartData Collective

Editing Guide for AI-Driven YouTube Video Creators

Editing Guide for AI-Driven YouTube Video Creators

There are a lot of benefits of using artificial intelligence in 2022. One of the biggest reasons that many people use AI is to improve their marketing strategies. A recent survey found that 64% of marketers reported that data-driven marketing strategies are more important than ever. One of the biggest reasons big data is so useful is that it helps supplement AI technology.

There are a growing number of ways for marketers to utilize AI in their branding strategies. One of them is with YouTube.

A growing number of companies are turning to YouTube to maximize the ROI of their marketing efforts. In order to get the most value out of their YouTube marketing strategy, they will need to use the best AI-driven video editing tools.

Using Artificial Intelligence to Improve Your YouTube Video Editing Process

For a video content creator, the most worrisome part of production can be editing footage together, especially without access to sophisticated editing software. Fortunately, YouTube offers a free web-based video editor through its platform. Some features are limited, but it’s enough to get started so that, at the very least, you can begin building your community and keep your viewers engaged and interested in watching more.

Artificial intelligence technology has made this much easier. Towards Data Science has an article on this topic, in which author Dmytro Nikolaiev points out that the process is a lot simpler if you create a Python script that automates the video editing process.
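For instance, here is a minimal sketch of what script-driven editing can look like, using the moviepy library; the file names and cut points are placeholders for illustration, not the script from the article:

```python
# A minimal sketch of script-driven video editing with moviepy.
# "raw_footage.mp4" and the cut points below are placeholder assumptions.
from moviepy.editor import VideoFileClip, concatenate_videoclips

raw = VideoFileClip("raw_footage.mp4")

# Keep only the segments worth publishing (start, end in seconds).
keepers = [(12.0, 58.5), (75.0, 140.0), (180.0, 215.0)]
clips = [raw.subclip(start, end) for start, end in keepers]

# Stitch the kept segments together and render the result.
final = concatenate_videoclips(clips)
final.write_videofile("edited_for_youtube.mp4", codec="libx264", audio_codec="aac")
```

Once the cut points live in a script like this, they can come from anywhere, including an AI model that flags the segments worth keeping.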

There are a lot of AI and data-driven video editing tools. Some that have been highlighted by Analytics India Magazine include:

Adobe Premiere Pro CC
QuickStories
Magisto
Lumen5

Akshaya Asokan points out that this makes the process much easier:

“In the recent past, we have come across AI-enabled smart cameras that can click the right picture and even identify the people in the visuals. Off lately, however, its application in video editing is becoming more common. From scanning the text script to understand the plot of the video to matching the visuals as per the script, AI is now automating workflow for video editors and are making their work easier.”

Here, you’re going to learn the most important aspects to keep in mind while editing your content for YouTube, and how to use YouTube’s video editing tools.

How to Edit Your Videos for YouTube

Editing videos can be pretty hard work and is often the most time-consuming part of the process. This is why it’s important to make sure your videos are helping you achieve your goals – whether that involves educating or entertaining your audience.

You will be able to automate many of the processes with AI, but you have to understand the fundamentals first. To help with this, here are some top tips for editing your videos in post-production:

Create Stories That Captivate

Videos, like books, tell a story. Just as we read books, viewers want to watch videos with an overarching storyline; they don’t want to be left scratching their heads at the conclusion, wondering where on earth it all came from. Therefore, your videos must follow a certain flow and the fundamental elements of storytelling, which can be as simple as having a beginning, middle, and end.

To ensure that viewers won’t get lost while watching your videos, you should make sure they have a clear sense of what the story is and how each segment you share relates to it. Before the editing begins and you start putting your footage together, prepare yourself with a short outline of the topics you will cover in your video and try sticking to it. 

Enhance Engagement Through Elements

High-quality, engaging visual and auditory elements will add to the quality of your storyline and keep your audience engaged. Use an online YouTube video editor tool and look for images, text, fun animations, or music appropriate for your story. Using these can give your videos an extra something that makes them memorable. However, always keep in mind what you want to achieve with them and how they will affect the overall theme of your video.

AI technology can help you create a number of new elements that will bolster engagement. Tools like Canva and Movie Maker have a lot of sophisticated AI features that can improve this process.

Create a Baseline Video Editing Style

To streamline your editing process and to create content that keeps your target audience engaged, it is always recommended to develop a baseline video editing style. This means that you would be following a specific style of editing your videos, using the same thumbnail style, music, transitions, and any other added elements that you’ll keep using for your upcoming videos. 

Your audience will most definitely recognize your consistent style and enjoy watching more of your content instead of being confused by the unfamiliarity with each video they watch. Your brand will also gain more recognition because it will project an image of stability and dependability! Moreover, you’ll spend less time trying to find new elements every time you edit because you won’t be scrambling around to find unique bits and pieces. 

For instance, you may give a snippet of the most interesting section in your video at the beginning to build up excitement for the subject you are about to discuss. Keeping your visual style similar for every content piece you put out there will give you a consistent framework while keeping your audience engaged.

Simple is Smart

Although it can be tempting to do all you can to captivate your audience, it’s also important to keep it simple and not overwhelm your viewers. Graphics and music that are too loud or flashy can distract your audience from the content you’re providing, rendering it useless in a sea of confusion and undue attention-grabbing. Find the right balance between original video content and add-on elements.

This is one of the reasons AI can be very effective. You can use AI to expedite editing and scale production, which is easier when you don’t have to create highly complex videos.

Take a Break and Come Back

If you sit editing your video for, say, 5 hours straight, all focused and squeezing your creative juices, you’re bound to feel stuck. You may even end up feeling that your work is bad before it has even had a chance. But when you step away for a few hours (or even days), you come back to the table with a renewed sense of purpose and a fresh pair of eyes.

Video editing can end up being a challenge, particularly if you have a chunk of footage that you’re still unsure what it should say or do. To improve things and make sure that your work is up to par with the competition, we recommend taking a step back from time to time.

Editing Videos on YouTube

The YouTube video editor is a basic tool that anyone new to the game can easily use. Here’s how you can go about this online editor so that you can make your video ready for the world to see. 

Step 1: Sign in to YouTube Studio
Step 2: Click on “Content” on the left sidebar
Step 3: Click on “Uploads”. Here, you will be able to see the videos that you can edit. Click on the video you want to edit.
Step 4: Click on the “Editor” option on the left sidebar
Step 5: Trim the Beginning and the End

To trim the beginning and the end of your video, select “Trim,” which is available next to the video’s timer (above the timeline panel). A blue box will appear in your video timeline panel. Drag and position it over the parts you’d like to chop off.

Step 6: Trim the Middle

If you want to trim parts of your video in the middle, click “Trim.” On your timeline panel, click on the beginning of the clip that you’d like to remove and select “Split.” Now, go to the end of the same clip and repeat the action. Finally, this clip can be removed or trimmed as per your requirement without affecting the entire video. 

Step 7: Click on “Preview” to View the Edit
Step 8: Save Your Video

If you’re satisfied with your work, click on “Save” on the top right corner of the screen.

Step 9: Add an End-Screen

If you want your viewers to be able to watch related or other videos of yours, click on the “End-Screen” icon at the left corner of your timeline panel. You can apply an end-screen template by clicking on “Apply Template” in the pop-up menu. You may also choose other options to show your viewers, such as Video, Playlist, and Subscribe.

And You’re Ready!

Editing is indispensable, and it’s always a slow process, so don’t get impatient. Like any profession, the editing process takes time and practice to create a strong foundation of fundamentals that define your identity as a creator. 

Experiment with what works best for you and your content, as this directly reflects on your work. Come up with a consistent editing strategy to follow and make sure to stay on track to keep your audiences engaged!

The post Editing Guide for AI-Driven YouTube Video Creators appeared first on SmartData Collective.

Source: SmartData Collective

Optimize your applications using Google Vertex AI Vizier

Optimize your applications using Google Vertex AI Vizier

Businesses around the globe are continuing to benefit from innovations in Artificial Intelligence (AI) and Machine Learning (ML). At F5, we are using AI/ML in meaningful ways to improve data security, fraud detection, bot attack prevention, and more. While the benefits of AI/ML for these business processes are well articulated, at F5, we also use AI/ML to optimize our software applications.

Using AI/ML for better software engineering is still in its early days. We are seeing use cases around AI-assisted code completion and auto-code generation for no-code/low-code platforms, but we are not seeing broad usage of AI/ML in optimizing the software application architecture itself. In this blog, we will demonstrate workload optimization of a data pipeline using black-box optimization with Google’s Vertex AI Vizier.

Performance Optimization  

Today, software optimization is an iterative and mostly manual process where profilers are used to identify the performance bottlenecks in software code. Profilers measure the software performance and generate reports that developers can review to further optimize the code. The drawback of this manual approach is that the optimization depends on the developer’s experience and hence is very subjective. It is slow, non-exhaustive, error-prone, and susceptible to human bias. The distributed nature of cloud native applications further complicates the manual optimization process.

An under-utilized and more global approach is a type of performance engineering that relies on performance experiments and black-box optimization algorithms. More specifically, we aim to optimize the operational cost of a complex system with many parameters. Other experiment-based performance optimization techniques exist, such as causal profiling, but are outside the scope of this post.

As illustrated in Figure 1, the process of optimizing the performance is iterative and automated. A succession of controlled trials is performed on the system to study the value of a cost function characterizing the system to be optimized. New candidate parameters are generated, and more trials are performed, until further improvement is too small to be worthwhile. More details on this process appear later in this post.

Figure 1: Black-box optimization – Iterative experiments to arrive at optimal output as a cost function
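To make the loop concrete, here is a minimal sketch of the iteration in Figure 1. The `suggest_parameters` and `run_trial` helpers are hypothetical placeholders: in practice the suggestions come from a service such as Vertex AI Vizier (discussed below) and the trial runs the real workload.

```python
# Schematic of the iterative black-box optimization loop in Figure 1.
def optimize(suggest_parameters, run_trial, max_trials=100, patience=10):
    history = []                                # (params, cost) observations so far
    stale = 0                                   # trials since the last improvement
    for _ in range(max_trials):
        params = suggest_parameters(history)    # candidate from the surrogate model
        cost = run_trial(params)                # controlled experiment on the system
        best = min((c for _, c in history), default=float("inf"))
        stale = 0 if cost < best else stale + 1
        history.append((params, cost))
        if stale >= patience:                   # improvements no longer worth the effort
            break
    return min(history, key=lambda h: h[1])     # best (params, cost) found
```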

What is the problem?

Let’s first set the stage – partly inspired by our experience, partly fictitious for the purpose of this discussion.

Our objective is to build an efficient way to get data from PubSub to BigQuery. Google Cloud offers Dataflow, a fully managed data processing service for executing a wide variety of data processing patterns, which we use for multiple other real-time streaming needs. For this use case, we opted to leverage a simplified custom stream processor for processing and transformations, to benefit from the ‘columnar’ orientation of BigQuery, a sort of ‘E(t)LT’ model.

The setup for our study is illustrated in more detail in Figure 2. The notebook in the central position plays the role of orchestrator for the study of the ‘system under optimization’. The main objectives (and components involved) are:

Reproducibility: in addition to an automated process, a Pub/Sub snapshot is used to initialize a subscription specifically created to feed the stream processor, reproducing the same conditions for each experiment.

Scalability: the Vertex AI Workbench implements a set of automated procedures used to run multiple experiments in parallel with different input parameters to speed up the overall optimization process.

Debuggability: for every experiment, the study and trial IDs are systematically injected as labels for each log and metric produced by the stream processor. In this way, we can easily isolate, analyze, and understand the reasons for a failed experiment or one with surprising results.

Figure 2: High level architecture for conducting the experiments

To move the data from PubSub to BigQuery efficiently, we designed and developed some code and now want to refine it to be as efficient as possible. We have a program, and we want to optimize it based on performance metrics that are easy to capture from running it. Our question now is: how do we select the best variant?

Not too surprisingly, this is an optimization problem: the world is full of them! Essentially, these problems are all about optimizing (minimizing or maximizing) an objective function under some constraints and finding the minima or maxima where this happens. Because of their widespread applicability, this is a domain that has been studied extensively.

The form is typically:

`min_{x ∈ X} f(x)`

read as: we want the x of a certain domain X that minimizes a cost function f. Since it is a minimization problem here, such x are called minima. Minima don’t necessarily exist, and when they do, they are not necessarily unique. Not all optimization problems are equal: continuous and linear programming is ‘easy’, convex optimization is still relatively easy, and combinatorial optimization is harder… and all of this assumes we can describe the objective function we want to optimize, even partially, such as by being able to compute gradients.

In our case, the objective function is some performance metric (still TBD at this point) of some program in some execution environment. This is far from f(x) = x²: we have no analytical expression for our program’s performance, no derivatives, no guarantee that the function is convex, the evaluation is costly, and the observations can be noisy. This type of optimization is called ‘black-box optimization’ because we cannot describe our objective function in simple mathematical terms. Nonetheless, we are very much interested in finding the parameters that deliver the best result.

Let’s now frame our situation as a concrete optimization problem before we further introduce black-box optimization and some tools, since we are looking for a way to automate solving this type of problem rather than doing it manually; ‘time is money’, as they say.

Framing as an optimization problem

Our problem has many moving parts but not all have the same nature.  

Objective 

First comes the objective. In our case, we want to minimize the cost per byte of moving data from PubSub to BigQuery. Assuming that the system scales linearly in the domain we are interested in, the cost per byte processed is independent of the number of instances and allows us to precisely extrapolate the cost of reaching a defined throughput.

How do we get there?  

We run our program on a significant and known volume of data in a specified execution environment (think specific machine type, location, and program configuration), measure how long it takes to process it, and calculate the cost of the resources, named `cost_dollar` below. This is our cost function f.

As mentioned earlier, there is no simple mathematical expression defining the cost function of our system, and evaluating it actually involves running a program, which is ‘costly’.
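As a rough illustration of what this cost function involves, here is a minimal sketch; the hourly rates and helper names are assumptions for illustration, not actual pricing or the code used in the study:

```python
# A sketch of the kind of cost function evaluated in each trial.
# The rates below are illustrative assumptions, not actual GCP pricing.
HOURLY_RATE_USD = {
    "n2-standard-4": 0.194,
    "n2-highcpu-4": 0.143,
}

def cost_dollar(machine_type: str, elapsed_seconds: float, num_instances: int = 1) -> float:
    """Cost of one experiment: resource price x runtime, for a fixed known data volume."""
    return HOURLY_RATE_USD[machine_type] * (elapsed_seconds / 3600.0) * num_instances

def cost_per_byte(machine_type: str, elapsed_seconds: float, bytes_processed: int) -> float:
    # Assuming linear scaling, cost per byte is independent of instance count
    # and lets us extrapolate the cost of reaching a target throughput.
    return cost_dollar(machine_type, elapsed_seconds) / bytes_processed
```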

Parameter space 

Our system has numerous knobs: the program has many configuration parameters corresponding to alternative ways of doing things we want to explore, and sizing parameters such as different queue sizes or numbers of workers. The execution environment defines even more parameters: VM configuration, machine type, OS image, location, and so on. In general, the number of parameters can vary wildly; for this scenario, we have a dozen.

In the end, our parameter space is described by Table 1, which gives, for each `parameter_id`, the type of value (integer, discrete, or categorical).
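To make this concrete, here is a sketch of how such a parameter space and objective could be expressed as a study spec for Vertex AI Vizier, the service we introduce below. The specific parameters and values are illustrative assumptions, since Table 1’s exact contents will differ for every system:

```python
# Sketch: encoding the objective and parameter space as a Vizier study.
# Parameter names, values, and the project path are illustrative assumptions.
from google.cloud import aiplatform_v1

study = {
    "display_name": "stream-processor-cost",
    "study_spec": {
        # The objective: minimize the measured cost of one run.
        "metrics": [{"metric_id": "cost_dollar", "goal": "MINIMIZE"}],
        # One spec per knob: categorical, integer, or discrete.
        "parameters": [
            {"parameter_id": "machine_type",
             "categorical_value_spec": {"values": ["n2-standard-4", "n2-highcpu-4"]}},
            {"parameter_id": "num_workers",
             "integer_value_spec": {"min_value": 1, "max_value": 16}},
            {"parameter_id": "queue_size",
             "discrete_value_spec": {"values": [128.0, 256.0, 1024.0]}},
        ],
        # Leaving the algorithm unspecified lets Vizier use its default strategy.
        "algorithm": "ALGORITHM_UNSPECIFIED",
    },
}

client = aiplatform_v1.VizierServiceClient(
    client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
)
parent = "projects/my-project/locations/us-central1"  # hypothetical project
created = client.create_study(parent=parent, study=study)
```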

The objective has been identified, we know how to evaluate it for a given assignment of the identified parameters, and we have defined the domain of these parameters. That’s all we need to do some black-box optimization.

Approach 

Back to the black-box optimization: we already stated this is a problem dealing with minimization/maximization of a function for which we have no expression, but which we can still evaluate! We just need to run an experiment and determine the cost.

The issue is that running an experiment has a cost, and given the parameter space, exploring it exhaustively quickly becomes unviable. Assuming you pick only 3 values for each of the 12-ish parameters: 3¹² = 531,441 combinations, which is large already. This method of systematically exploring all the combinations generated from a subset of each parameter taken individually is called grid search.

Instead, we use a form of surrogate optimization: in cases like this one, where there is no convenient representation of our objective function, it can be beneficial to introduce a surrogate function with better properties that models the actual function. Admittedly, instead of one problem (minimizing our cost function) we now have two: fitting a function to our problem and minimizing it. But we gained a recipe to move forward: fit a model on the observations and use this model to help choose a promising candidate for which to run an experiment. Once we have the result of the experiment, the model can be refined and new candidates generated, until marginal improvements are not worth the effort.

Google Cloud Vertex AI Vizier offers this type of optimization as a service. If you want to read more about what is behind it (spoiler: it relies on Gaussian Process (GP) optimization), check this publication for a complete description: Google Vizier: A Service for Black-Box Optimization.
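As a sketch of how the service plugs into the loop described above, the code below asks Vizier for a candidate configuration, runs the experiment, and reports the measurement back. `run_experiment` is a hypothetical helper standing in for launching the stream processor with the suggested configuration, and `client` and `created` come from the study-creation sketch above:

```python
# Sketch of the suggest/evaluate/report loop against Vizier.
study_name = created.name

for _ in range(148):
    # Ask Vizier for one new candidate configuration.
    op = client.suggest_trials(request={
        "parent": study_name,
        "suggestion_count": 1,
        "client_id": "orchestrator-notebook",
    })
    trial = op.result().trials[0]
    params = {p.parameter_id: p.value for p in trial.parameters}

    # Hypothetical helper: runs the workload on a fresh, snapshot-initialized
    # subscription and returns the measured cost in dollars.
    cost = run_experiment(params)

    # Report the result so the model can refine its next suggestions.
    client.complete_trial(request={
        "name": trial.name,
        "final_measurement": {
            "metrics": [{"metric_id": "cost_dollar", "value": cost}],
        },
    })
```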

We performed 148 different experiments with different combinations of input parameters. What have we learned?  

Results of our study

The point of this discussion is not to detail precisely what parameters we used to get the best cost; this is not transferable knowledge, as your program, setup, and pretty much everything else will be different. But to give an idea of the potential of the method: in our case, with 148 runs, our cost function went from $0.0780/run with our initially guessed configuration down to $0.0443/run with the best parameters, a cost reduction of 43%. Unsurprisingly, the `machine_type` parameter plays a major role here, but even with the same machine type as the one offering the best results, the (explored) portion of our cost function ranges between $0.0443/run and $0.0531/run, a variation of 16%.

The most promising runs are represented in Figure 3. All axes but the last two correspond to parameters. The last two represent, respectively, the objective `cost_dollar` and whether the run completed. Each line represents a run and connects the values it took on each axis.

To conclude on the study part: this helped us uncover substantial cost improvements with almost no intervention from our end. Let’s explore that aspect more in the next section.

Learnings on the method 

One of the main advantages of this method is that, provided you have been through the initial effort of setting things up suitably, it can run on its own and requires little to no human intervention.

Black-box optimization assumes the evaluation of f(x) depends only on x, not on what else is going on at the same time.

We don’t want to see interactions between the different evaluations of f(x). 

One of the main applications of Vizier is deep learning model hyper-parameter optimization. There, training and evaluation are essentially devoid of side effects (cost aside, but we already said black-box optimization methods assume the evaluation is costly and are designed to reduce the number of runs needed to find the optimal parameters). Our scenario definitely has side effects: it moves data from one place to another.

So, if we ensure all side effects are removed from our performance experiment, life should be easy on us: black-box optimization methods apply, and Vizier in particular can be used. This can be achieved by wrapping the execution of our scenario in logic that sets up and tears down an isolated environment, making this whole new system essentially free of side effects.

A couple of lessons on running these kinds of tests that we think are worth highlighting:

Parameterize everything, even if there is a single value at first: if another value becomes necessary, it is easy to add. Worst case, the values are recorded along with your data, making it easier to compare things between different experiments if needed.

Isolation between runs and other things: if something is not parameterized and has an impact on the objective, it will make the measurements noisier and make it harder for the optimization process to be decisive about where to explore next.

Isolation between concurrent runs: so you can run multiple experiments at once.

Robust runs: not all combinations of parameters are feasible, and Vizier supports reporting them as such (see the sketch after this list).

Enough runs: Vizier leverages the results of previous runs to decide what to explore next, and you can request a number of trials at once, without having to provide the measurements yet. This is useful for running experiments in parallel, but in our experience it is also useful for making sure you initially have broad coverage of the parameter space before the exploration starts trying to pinpoint local extrema. For example, in the set of runs we described earlier in the post, ‘n2-highcpu-4’ didn’t get tried until run 107.

Tools exist today: Vizier is one example, available as a service. There are also many Python libraries available for black-box optimization.
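For the robust-runs point above, here is a sketch of how an infeasible combination can be reported back to Vizier, continuing the loop sketch from earlier; the `WorkloadFailed` exception is a hypothetical stand-in for however the trial runner signals failure:

```python
# Sketch: mark a failed combination infeasible so the model learns to avoid it.
try:
    cost = run_experiment(params)
except WorkloadFailed as err:  # hypothetical exception from the trial runner
    client.complete_trial(request={
        "name": trial.name,
        "trial_infeasible": True,
        "infeasible_reason": str(err),
    })
else:
    client.complete_trial(request={
        "name": trial.name,
        "final_measurement": {
            "metrics": [{"metric_id": "cost_dollar", "value": cost}],
        },
    })
```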

Definitely something to have in your toolbox if you don’t want to spend hours turning knobs yourself and would rather have a machine do it.

Conclusion and next steps 

Black-box optimization is unavoidable for ML hyper-parameter tuning. Google Vertex AI Vizier is a black-box optimization service with a wider range of applications. We believe it is also a great tool for the engineering of complex systems that are characterized by many parameters with essentially unknown or difficult to describe interactions. For small systems, manual and/or systematic exploration of the parameters might be possible, but the point of this post is that it can be automated!

Optimizing performance is a recurring challenge as everything keeps changing and new options and/or new usage patterns appear.

The setup presented in this post is relatively simple and very static. There are natural extensions of this setup to continuous online optimization, such as multi-armed bandits, that are worth exploring from a software engineering perspective.

What if the future of application optimization was already here but not very evenly distributed – to paraphrase William Gibson? 

Think this is cool? The F5 AI & Data group is hiring.

References 

Google Vizier: A Service for Black-Box Optimization
Google Cloud Vertex AI Vizier

Source: Data Analytics

Securing Venture Capital for Your New Cloud Startup

Securing Venture Capital for Your New Cloud Startup

Are you trying to grow or launch a cloud technology startup? You won’t be able to do so without a significant amount of capital. Recent news reports on Infracost can give you some insights on the cost of launching a cloud startup. This company raised over $2.2 million in funding to grow its operations. Of course, they had to spend a lot more money to start their business in the first place.

You can’t underestimate the cost of launching and growing a cloud-based business. You are going to need to recognize the barriers to entry in this industry and make sure that you have access to the capital needed to cover these costs.

You might want to consider using venture capital to raise the funds needed to grow your cloud business. There are a lot of advantages of VC funding for technology startups.

Getting Venture Capital for Your Cloud Startup

Venture capital is a term that you are probably familiar with. After all, many of the most successful and best-known companies in the world, from Uber to Facebook to Airbnb, are backed by venture capital. More cloud companies are taking advantage of venture capital as well.

You might not know what venture capital funding is, how it works, and who it is suitable for. However, if you would like to raise funds for your cloud startup, the following few paragraphs should help you know whether venture capital is something you should consider, especially if you are a start-up.

What Is Venture Capital (VC)?

Venture capital refers to a type of equity financing that involves investing capital and getting equity in return, usually in the form of a minority stake in a company that looks poised for significant growth. A venture capitalist is an individual who makes these investments.

Venture capital is technically a form of private equity (PE). However, private equity is usually used to refer to investments made into more mature businesses by private equity firms.

Venture capital is especially attractive for high-tech startups, such as cloud businesses. They have strong growth opportunities, which makes them appealing to VCs.

How Does Venture Capital Work?

Unlike angel investors who use their own money for investing, venture capitalists typically work for venture capital firms that raise funds from outside investors. Those investors, referred to as limited partners, may include family offices, high net worth individuals, or even institutional investors such as insurance companies and pension funds.

Venture capitalists use the capital raised to invest in businesses with a high potential for growth or have already demonstrated impressive growth. Various stages of venture capital funding reflect multiple phases of the development of a company. As start-ups grow, they usually go through these stages and raise several rounds of venture capital funding.

Venture capital firms sometimes have a diversified approach involving investing in companies at different stages of the business lifecycle, while others focus on certain stages. For instance, seed investors aid early-stage start-ups to get off the ground, while late-stage investors allow established companies to keep expanding. Venture capital firms also specialize in making investments within a particular industry or industry vertical.

Venture capital financing allows businesses to obtain large amounts of capital. Furthermore, the right investor helps add value to the company by providing experience, skills, and connections. Investors will often want to join the company’s board as either a board advisor or official board member as part of a venture capital deal. That way, they are involved in the company’s strategic or operational decisions and can play a crucial role in ensuring success.

As a cloud company, you need to show that you will be able to meet your growth projections and be profitable in the long term. You might have an easier time if you can show that your business will bring valuable benefits to small businesses, since they are such a rapidly growing share of the market.

Is Venture Capital Suitable for Your Business?

Venture capital firms are best known for funding technology companies due to their tendency to scale quickly. Still, they are not limited to this industry, since they also invest in non-tech businesses. One common denominator in all venture-backed businesses is that they are oriented towards significant and rapid growth. Venture capital is best suited to entrepreneurs with big ambitions who have no interest in retaining complete control over the company as it grows.

Here are some of the things that investors usually look for when they evaluate a business:

– Solves a real customer pain point: The service or product should not just be ‘nice-to-have’; it needs to create actual value for customers and solve a problem.

– Scalability: Venture capitalists look for companies with the potential to increase sales and grow efficiently and cost-effectively.

– Exit opportunities: There must be a potential way for venture capital firms to exit, realize returns, and get the money back to their investors.

Funding for Cloud Technology Companies

We are a venture capital firm that invests in businesses looking to either develop or exploit cloud technology. We work with companies from the start-up stage through the exit and offer entry equity investment ranging between $68,000 and $2.7 million, up to a maximum of about $6.8 million for each round. There are many benefits of cloud computing, which makes companies in this sector attractive to venture capitalists.

We aim to give the companies that we support a competitive edge and create value in the long run. We can also co-invest alongside various other funding sources such as venture capital firms.

The post Securing Venture Capital for Your New Cloud Startup appeared first on SmartData Collective.

Source: SmartData Collective