Archives November 2022

Versatility of Using Machine Learning for Video Editing

Video editing has changed significantly over the years. During the first half of the 20th Century, people had to splice film manually, which could create all kinds of problems.

Machine learning technology is one of the new technologies that has drastically changed the state of video editing. This technology uses deep neural networks to automate the process. Machine learning also makes it easier for video editors to create more engaging videos by utilizing unique features that were not previously possible. Analytics India Magazine reports that machine learning is only getting better at video editing.

Machine learning has also made it easier for amateur video editors to learn this skill. Keep reading to learn more.

Machine Learning Creates Massive Opportunities for Video Editors

Unfortunately, the complexity of professional editing software makes it hard for people to edit social media or marketing videos. Like many beginners, you may struggle to understand editing software and end up with a ton of unedited footage sitting on your hard drive, mainly because you are too intimidated to learn how to turn it into stunning videos.

As a result, a growing number of video marketers are utilizing tools that take advantage of machine learning to make the process simpler. We have talked about some of the great free video editing tools that use AI in the past. However, we want to focus on a cutting-edge tool called Filmora, which uses remarkable AI algorithms.

Tools like Wondershare Filmora find unique ways to use AI to create higher quality video content. Wondershare Filmora has an AI Portrait tool that enables users to easily remove video backgrounds without the use of a green screen.
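
Filmora doesn’t publish its algorithms, but the underlying idea behind AI background removal (per-frame person segmentation with a neural network) can be sketched in a few lines of Python. The following is a minimal illustration, assuming the open-source MediaPipe library and a hypothetical clip named input.mp4; it is not Filmora’s actual implementation.

import cv2
import mediapipe as mp
import numpy as np

# A person-segmentation model, similar in spirit to an "AI Portrait" tool.
segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)

cap = cv2.VideoCapture("input.mp4")  # hypothetical input clip
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # The model expects RGB input; OpenCV reads frames as BGR.
    mask = segmenter.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)).segmentation_mask
    # Keep pixels labeled "person"; paint the rest green, mimicking
    # a green screen without ever filming in front of one.
    background = np.full(frame.shape, (0, 255, 0), dtype=np.uint8)
    composite = np.where(mask[..., None] > 0.5, frame, background)
    cv2.imshow("preview", composite)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()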

Stanford Engineering reports that AI is making major breakthroughs in video editing. They showed that Maneesh Agrawala, a renowned computer graphics expert, was able to change what a public figure appeared to say on video simply by typing a few words on a screen.

I believe a good video editor is one that has a user-friendly interface and a wide range of video presets to choose from; the Filmora video editor is a good example. With enough practice and consistency, you can easily master editing cool videos and reels alike.

AI Creates Amazing Features for Video Editors Using Wondershare Filmora

With a self-explanatory interface that doesn’t require much expertise to operate, Wondershare Filmora is truly the best choice among the free editing software on the market, especially for YouTubers, freelancers, and digital marketers. This tool has some amazing machine learning capabilities that make these processes easier than ever. Once I downloaded the free version from the website, it took me no more than five minutes to figure out how to edit videos with it.

It’s very user-friendly: as simple to use as the popular photo-editing app Canva. There’s almost no learning curve; you can edit your videos using any of the premade templates, such as vlog, business, or holiday settings. You can use its machine learning capabilities to edit videos more quickly by choosing Instant Mode, which automatically edits the video for you without any manual intervention.

Everything you need can be found on the top two toolbars after you open the app on your Windows or Mac computer. There are many tools at your disposal, including media, audio, titles, transitions, effects, components, and split-screen formatting. Each of these categories contains dozens of additional tools for enhancing your videos.

However, the vast majority of videos may only need basic trimming. To do this, simply drag and drop your clips into the timeline, stop the playback at the desired frame, and then click the orange scissors.

Easy Drag and Drop

Yes, we’ve been here before, but now you can simply drag and drop to rearrange the various elements of your video, be it music, titles, transitions, or effects. The tool also uses machine learning in its interface to help determine where the user wants to drop new elements.

When it comes to video editing, Filmora is as easy as any video editor can get. In other words, there is hardly any learning curve to speak of, and you can start editing videos like a professional right away!

AI Helps Editors Choose Appropriate Elements and Transitions

Transitions and elements help move the story along and allow video editors to convey mood and tone to viewers. Dissolve, cutaway, fade, zoom: even these simple transitions let editors emphasize their message and tie their shots together.

So, think about what you can do with the abundance of exclusive transitions and elements included in the Filmora video editor. Remember that using transitions well does not mean using the flashiest or most noticeable ones to tie shots together. Instead, it is the invisible transitions that make the difference and create a seamless cohesiveness.

Tools like this can also use AI to help suggest new elements that can be used to improve the quality of the video.

Audio Editing

Since music is critical for videos, Filmora provides a wide variety of audio that can serve as the video’s soundtrack. Honestly, finding music for my videos was a tedious process before I started using Filmora, but now I can find what I need in a flash.

Fortunately, Filmora also uses machine learning to improve video editing capabilities. Clicking the second tab in the main menu will take you to the audio library, where you’ll find a plethora of snippets sorted by genres including rock, folk, electronic, and more.

Any media file can have its playback speed changed, or even reversed, with the click of a mouse. There is a range of speeds available, or you can choose your own.

You could also use the speech-to-text option to narrate your product review and have your voice converted into text automatically.

Duration

Filmora’s ‘Duration’ option in the timeline menu is another great addition; it lets you specify how long you’d like your film to be, and the program will automatically speed it up or slow it down to fit your specifications.

You can add your favorite settings to the Favorites folder for quick and easy access. Any audio files, titles, transitions, effects, and so on that you use regularly can be saved to that folder by simply hovering over them.

Additional Options for Your Animation Videos with Filmora’s Advanced Editing Features:

Filmora’s keyframing can be used for fluid animated effects.

Raise the bar on your editing skills with simple, accessible Motion Tracking.

Using speed ramping, you may adjust the pace of your video with more precise control over your keyframes, allowing you to produce striking cinematic effects.

Take advantage of more than one display to tell your tale in a novel way with split-screen mode.

Enhance your editing skills with the help of “Green Screen,” a feature that lets you swap in any background using chroma keying.

Masking and blending tools help you conceal unwanted details or remove shadows from your videos.

Use the Boris FX and NewBlue FX plugins to add visual effects, like particle effects, and make your video more professional.

Improve face detection and remove video backgrounds by using the AI Portrait feature.

Instantly create videos by using the Instant Mode option. Just add all your project files and choose your favorite video template.

The extra functions offered by Wondershare, such as Speech-to-Text and Wondershare Drive, are rather impressive. As the name suggests, Speech-to-Text allows you to transform your voice into text with just one click, sparing you the trouble of transcribing it manually.

Wondershare Drive allows you to save and share your project files, project templates, and output movies.

Pricing

When it comes to pricing, the Filmora video editor is quite reasonable. It offers an annual plan at just $49.99 and a perpetual plan at just $79.99. You can renew or cancel your plan at any time.

The Effects & Plug-ins upgrade is available for free for 7 days and then renews at a discounted rate of $20.99 a month (43% off) until you cancel.

Unlike other expensive video editors, Wondershare Filmora is a professional-grade video editor that won’t break the bank at $50 a year for an average user.

Verdict

I have tried and tested several video editors over the years and decided to stick with Filmora due to its friendly UI, lower price, and amazing editing options. I can't stress enough how much easier it has made my life, and the lives of the fellow YouTubers to whom I recommended the software.

Machine Learning Technology Has Led to Major Breakthroughs in Video Editing

Video editing is evolving quickly due to new advances in machine learning. As we stated before, the benefits clearly outweigh the drawbacks. A growing number of video editors are using machine learning tools like Filmora to make these tasks faster and better than ever.


Challenges of Hiring Dedicated Developers for AI Projects

AI technology is clearly changing our world for the better. One survey found that 86% of CEOs believe AI is essential to their offices. Around two-thirds of consumers believe that AI will improve safety with cars.

The global market for AI is worth over $328 billion, and that figure is growing 20% a year. However, it can only keep growing if companies invest in it and hire the best programmers.

Developments in AI depend on companies that are willing to hire the best developers. Are you working on a new app or other project that relies heavily on AI technology? You are going to want to hire the best developers to back your project.

Unfortunately, this is easier said than done. The impact of the pandemic and the resulting talent shortages is beginning to force businesses to rely on outsourcing development services more often than in the past. Lonne Jaffe, Managing Director for Insight Partners, writes that the talent shortage has seriously hindered the development of AI technology. AI may be the future of technology, but only if we can hire great developers.

According to IT outsourcing industry research by Statista, the global market for IT outsourcing services is valued at $92.5 billion. When you have a startup idea and a clear product vision, hiring a team of dedicated developers is not as much of a challenge these days. In this article, we explore the most common reasons to consider outstaffing, including cost savings, a wider pool of qualified developers, resource optimization, and flexibility.

Challenges to Overcome When Hiring AI Developers Overseas

If you want to help develop new AI tools and apps, then you need to find a way to address the ongoing talent shortage. We will look into the challenges involved when searching for the right offshore outsourcing strategy. Let’s describe the main ones and consider some solutions to minimize them.

Time zone differences

The world is much bigger than the country in which your company is located, which means the talent pool is theoretically much larger. If you choose software development from offshore vendors, your business will have access to far more candidates, with a wider range of skills and more impressive portfolios. But working with a partner on the other side of the globe means that your team may work while you are asleep, and vice versa. This can cause significant organizational challenges. For example, if you need clarifications or want to contact your team, you cannot reach them before the start of their working day, which may cause significant delays in the project.

As an outsourcing company, we have encountered issues stemming from time zone differences and can share a few tips on how to handle them. The most important is to use the right tools to schedule meetings and calls, coordinate with other team members, and resolve other management issues. Efficient time zone management tools can help offshore development services providers manage these differences better.

Communication and cultural gaps

In addition to time-zone challenges, you may encounter communication gaps. Of course, most developers today speak at least a little English, but if their proficiency level isn’t up to scratch, you may have severe issues with communication. Even if you do not have communication challenges, there can still be issues stemming from a cultural gap. People in different countries have different attitudes towards their work, authority, words, and actions. These peculiarities may interest a tourist, but cultural differences can cause serious inconvenience and conflict when it comes to business. Solving these challenges is not as easy as managing time zones, but they can be handled if the company hires the right project manager. Additionally, the interview stage is a great place to evaluate the soft skills of potential developers.

Mismatched expectations 

Before starting cooperation, you should discuss the main stages of the work and be clear on all details in the contract. Your dedicated developers must understand precisely what you want and find the most effective ways to implement your idea. Business owners often outsource their web application development projects without fully understanding what they’re getting into. If you have only a vague idea of your project, it becomes quite difficult to establish precise requirements for others and near-impossible to assess whether the process is proceeding accordingly. The challenge is to ensure that you, as a customer, have a good understanding of your project needs, can answer all questions, and provide support and direction throughout the work process. The importance of communication between all parties should not be underestimated. Here are a few of the things that you should spell out for the outsourcing company: an accurate profile of the target audience; the main functions of your application; the technologies you want to use when developing the application; the business goals of the organization; and design details with examples of existing applications.

Quality control

Choosing a reliable partner that can create a quality product that fully meets your requirements and expectations is difficult. To address this, it’s important to start by checking the reputation and experience of the outsourcing company. In addition to portfolios, look at third-party sources of information about the company, and check reviews and ratings. Before choosing and starting work, it is necessary to form an image of the company. Checking similar past projects and favoring companies that have worked in a variety of industries can help mitigate a lot of the risk. Careful study of the portfolio will help you understand whether the studio has cases suited to your request, whether its developers have the necessary hard skills, and whether the latest tech trends are being utilized. Another good sign is if the company is ready, at the initial stage, to share detailed information on its intended approach to the work, indicate how many people will work on the job, and state when the developers will start. These are good indicators of competence, readiness, and transparency. Additionally, it is critical to test and control your project during the development process. This way, you’ll ensure that your final product will meet expectations and function adequately. QA should be involved in the development from the outset; otherwise, many more errors are likely to be encountered in the final product.

Security issues

One of the most dangerous and common challenges associated with outsourcing concerns security. Offshoring introduces higher security risks compared to in-house hiring. Working with dedicated developers often involves the transfer of confidential data, which can result in data leaking to hackers or scammers. The following recommendations can help address this challenge. Firstly, you should have any partners sign a Non-Disclosure Agreement (NDA) and ensure they use reliable security protocols. Additionally, it is wise to avoid cut-rate companies and freelancers with no track record or reputation to uphold; otherwise, you risk poor quality or outright project failure, with no security guarantees.

Manage Developers Effectively When Creating New AI Apps

Creating great AI applications requires great developers. Hiring dedicated developers has become a growing trend across many industries as an effective way of approaching AI software development. The most essential benefits of offshore development include reduced costs, more qualified developers, optimization of resources, and flexibility. However, businesses that work with offshore talent need to be ready to overcome the challenges that come with hiring dedicated developers, such as time zone differences, communication and cultural gaps, mismatched expectations, quality control, and security issues. To overcome such challenges, the company should conduct a qualitative investigation of its outsourcing partner, conduct interviews with developers, and use appropriate management tools.


Cloud and AI Technology Help USB Flash Drives Stay Relevant

Cloud technology and AI are rapidly changing the state of our technological landscape. Many old forms of technology have started to become obsolete, as a growing number of new tools utilizing these new forms of technology are making things easier.

However, the cloud and big data are also offering some benefits that help older forms of technology stay relevant. USB drives are an example. Despite the growing relevance of cloud technology, global customers still spend over $35 billion on USB devices, and the market is growing over 9% a year. Ironically, newer technologies may be making USBs even more useful by making them easier to back up and simplifying data recovery.

How Do the Cloud and AI Help USB Drives Stay Relevant?

USB drives may seem like they would not be as useful in a day where so much data can be easily stored on the cloud. However, they have some clear benefits.

For one thing, they are portable and can be accessed without an Internet connection. As versatile as cloud technology is, you still need to be able to connect to the Internet to access data stored on it. This is one of the reasons that cloud technology hasn’t eliminated the need for data recovery software.

However, there are some downsides of USB devices. One of the biggest challenges is that data can be easily lost or corrupted.

Fortunately, advances in AI and cloud technology have created the best of both worlds for USB users. They can use cloud integration tools from companies like Kingston to back up their USB data to the cloud.

AI technology has also resolved some of the problems with USB technology. AI has helped restore lost data on various types of devices, including USB drives. Joanna Riley has talked about some of the benefits of using AI to improve data recovery efforts, such as predicting when data storage capacity will be exceeded or performing emergency restorations.

One of the benefits of using AI for data recovery on USB drives is that the tools can better identify lost data and reconstruct it more fully, making restoration faster and more complete.

If you want to use AI to restore data from a USB, then you need to make sure that you use the right tools. Keep reading to learn more.

What Steps Should You Take to Restore Lost Data from USB Drives?

With an increased dependency on data storage devices, there is a significant increase in data loss situations. Data loss can be due to an emptied recycle bin, an accidentally formatted device, or a malware attack. So today we bring you a detailed but simple guide on how to recover permanently deleted photos from any storage device.

As we stated in the past, AI and big data are changing the state of the data recovery industry. You need to know how to use the right AI tools to recover lost data more easily.

We’ll go through the detailed steps to get back deleted or lost data easily and effectively with professional data recovery software. Some alternative methods will also be introduced, in case you don’t want to install any software. We’ll also cover quick image recovery tips for beginners to help eliminate data loss problems, as well as some alternatives like using a backup, CMD, etc. Let’s start by recovering photos from a USB flash drive with photo recovery software.

Part 1: Steps to recover deleted photos from a USB flash drive with Wondershare Recoverit

Wondershare Recoverit is a one-stop solution when it comes to recovering lost data. Whether it is the careless use of the “Shift + Delete” command, an emptied recycle bin, a virus attack, or a system crash, this tool can handle it precisely and save your data from any kind of disaster. In addition, this data recovery software can recover more than 1,000 file formats from almost all storage devices, such as USB flash drives, SD cards, digital cameras, etc.

The quick steps to use Wondershare Recoverit for recovering photos are:

Start by launching Recoverit Photo Recovery on your PC or Mac.

Select the location: Select the hard disk for recovering the deleted photos. Click on the USB flash drive option. Press the “Start” button to initiate the hard disk scanning.

Scanning the location: The software will start a thorough scan of the selected location to trace deleted, deeply hidden, or lost photos. You can also preview the scanning results during this process.

Recovering the lost photos: Preview the recovered files and click on the “Recover” option to get the photos back.

Why choose Wondershare Recoverit?

Out of all the options available in photo recovery software, Wondershare Recoverit is the preferred choice of millions of users globally due to the following reasons:

Specialized in restoring lost photos, audio, videos, etc., from USB flash drives, SD cards, cameras, SSD, HDD, etc.

Performs comprehensive scanning to find deleted photos, videos, and audio files from external and internal devices.

Awarded 35 patents for innovative data recovery methods.

Supports restoring data on crashed computers by creating a bootable USB drive.

Saves time by offering a quick preview of the files before recovery.

Part 2: How to recover deleted photos from a USB flash drive without any software?

As you can see, using Wondershare Recoverit to restore photos from different storage devices is a simple and straightforward process. Readers looking for quick alternatives for getting deleted pictures back without software can go through the following two options:

Option 1: Recovering from a backup

If you back up your system data regularly, it is easy to restore deleted photos from the system backup. Let us go through the quick steps to use the in-built backup and recovery tool for Windows:

Connect the storage media having system backup.

Open the Windows “Start” button and then press “Control Panel” > “System and Maintenance” > “Backup and Restore.”

Select either the “Restore my files” or the “Restore all users’ files” option.

Search the lost photos using the “Browse for files” or “Browse for folders” option. Select “Restore” to recover the selected file.

Option 2: Recover deleted photos from CMD

Besides the Windows built-in backup and recovery tool, you can also recover photos using CMD. The command prompt approach works well for data loss situations involving hidden or corrupted photos. The quick steps are:

Go to the “Start” menu and type “cmd” in the search bar.

Select the “Run as administrator” option.

Type “chkdsk *:/f” in the command prompt window. Replace * with the drive letter and press Enter.

Now type “ATTRIB -H -R -S /S /D D:\*.*”. Replace D with the drive letter and press Enter.

It will immediately start the file restoration process.
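
If you would rather script these two steps than type them manually, they can be wrapped in a short Python script. This is a sketch assuming Windows, administrator rights, and a USB drive mounted as E:; replace the drive letter with your own.

import subprocess

DRIVE = "E:"  # replace with your USB drive letter

# Step 1: check the file system and fix logical errors (same as "chkdsk E: /f").
subprocess.run(["chkdsk", DRIVE, "/f"], check=True)

# Step 2: clear the hidden, read-only, and system attributes so that
# "invisible" files reappear in Explorer.
subprocess.run(["attrib", "-h", "-r", "-s", "/s", "/d", f"{DRIVE}\\*.*"], check=True)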

Part 3: Tips to avoid losing photos on your USB flash drive

After going through the detailed steps for recovering photos from USB flash drives using different methods, it all comes down to some quick tips. Below are some of the easiest ways to avoid needing permanent photo recovery in the first place:

Start by backing up your photos to a different device before moving them to an external storage device. It is easy to connect the external USB device to your computer, take a quick backup, and use your system as the backup device.

It is recommended not to delete any photos from the photo storage location during a backup, to ensure a complete backup of your images.

It is advised to save the recovered photos in a different location from the one where they were deleted. This eliminates the risk of a repeat data loss.

Download image recovery software like Wondershare Recoverit to a safe location on the system. Further, it is recommended to choose a tool that supports multiple file formats.

AI and the Cloud Make USB Devices Even More Promising

There are a lot of reasons that cloud technology and AI are helping the USB market. These new technologies make it easier to backup and recover data from USBs.

Different causes, like malware attacks, accidental use of the “Shift + Delete” command, or an emptied recycle bin, may create the need for an image recovery tool. Fortunately, these data loss scenarios can be quickly managed with a leading tool like Wondershare Recoverit, which can effectively recover deleted photos, audio, videos, and more. Further, it is easy to follow the different steps for recovering permanently deleted photos from a USB flash drive.

This tool lets users quickly go through the detailed steps for photo recovery. There is no need to worry about your data when it is this easy to recover deleted photos from a backup or via CMD on your system. And don’t miss the quick tips on image recovery above.


5 Considerations for Choosing the Right Data Center

We have talked extensively about the benefits of data centers in the past. We mentioned that they are valuable for ecommerce businesses and many other sectors. However, many companies still don’t know how to choose them.

Many organizations are increasing their Big Data footprint and looking to data centers to help them grow. Global companies are projected to spend over $274 billion on big data this year, and data centers have played a role in this trend. There are many data centers available to all types of businesses now, but finding the best fit isn’t always the easiest decision. Housing your critical data and infrastructure in someone else’s facility necessitates a great deal of trust and thought, but not all data centers are created equal. Each facility has different standards and services, and you must consider several factors when selecting the best data center.

However, as data centers have grown in popularity and function, so has online and advertisement fraud. The electronic weaponization of data center traffic has played a significant role in the rise of ad fraud over the last decade.

Geographical Location

One of the most important considerations when choosing a data center is its location. While there are many data center locations worldwide, the right facility should have a physical location that someone from your company can easily access. This is a highly important consideration if you ever need to upgrade or service your equipment. IT redundancy is also crucial. If your on-premises production environment fails due to a disaster, such as a cyber-attack, make sure you have a plan in place to failover operations to a different data center.

Data Center Scalability

Your provider should be able to meet your current and future needs for at least several years to come. However, service providers typically provide varying degrees of scalability. While some will provide out-of-the-box solutions that may only meet your initial needs, others will provide customizable solutions that can expand and add more resources as needed, allowing your company to have enough resources when you need them the most.

These adaptable solutions can also scale down when the server load decreases, such as during peak and off-peak seasons of a ticket booking system. Having a service provider with this level of scalability will benefit your business during peak seasons and save you money during off-peak seasons.

Support and Uptime

Your data center should be available to provide your company with quick and efficient support. A Service Level Agreement is typically used to govern this. The ideal data service provider should offer in-house remote support with engineers available at all hours of the day. Furthermore, on-site assistance is required during emergencies or for important tasks, such as server checks, disaster recovery, and network maintenance. Note: in the IT industry, redundancy is critical. If your on-premises production environment fails due to a natural disaster, human error, or a cyber-attack, you must be able to move operations to another data center quickly.

Reliability and Reputation

The reputation of your data center vendor is another crucial factor. Data center compliance can mean the difference between passing an audit and getting entangled in litigation. Security is also an essential consideration for data centers. For example, healthcare providers who handle sensitive patient data require data centers that are explicitly HIPAA-compliant. PCI-DSS compliance, on the other hand, is required for any organization that handles the transfer of credit card details. Even government agencies require special compliance standards to ensure data security, particularly the protection of Personally Identifiable Information (PII).

Network Interconnectivity and Connection Quality

The potential to interlink within a shared data center space is one of the most significant benefits of colocation. Interconnection can be highly beneficial to your business, whether you want to connect with partners, distributors, or even competitors for peering. Take the time to learn in-depth about the available connectivity options before making a decision. Does the vendor already have a large customer ecosystem that is interlinked? Is there a platform in place to facilitate multi-load connectedness and management?   

In Conclusion

Businesses and organizations need to carefully research their options when it comes to data center vendors. In today’s competitive market, many organizations utilize the public cloud to drive their businesses. Many of the giants in the market, such as Amazon Web Services (AWS) and Google, are setting cloud trends with innovative technologies backed by solid security. Choosing the right partner can place your business on the fast track to success.  

To learn more about data centers, what you need to consider when choosing one, and how to avoid online fraud, visit this blog by CHEQ.AI.


Built with BigQuery: How Connected-Stories leverages Google Data Cloud and AI/ML for creating personalized Ad Experiences

Editor’s note: The post is part of a series highlighting our awesome partners, and their solutions, that are Built with BigQuery

In the field of producing engaging video content such as ads, many marketers ignore the power of data to improve their creative efforts to meet the consumers’ need for personalized messages. The demand for creative tech to efficiently personalize is real as marketers need personalized video Ads to reach their audience with the right message at the right time. Data, Insights and Technology are the main ingredients to deliver this value while ensuring security and privacy requirements are met. The Connected-Stories team partnered with Google Cloud to build a platform for Ad personalization. Google Data Cloud and BigQuery are at the forefront to assimilate data, leverage ML models, create personalized ads, and capitalize on real-time intelligence as the core features of the Connected-Stories NEXT platform.

Connected-Stories NEXT is an end-to-end creative management platform for developing, serving, and optimizing interactive video and display ads that scale across any channel. The platform ingests first-party data to create custom ML models, measures numerous third-party data points to help brands develop unique customer journeys, and creates videos that these data signals can drive. An intelligent feedback loop passes real-time data back, enabling brands to make data-driven, actionable video ads that take the brand’s campaigns to the next level.

The core use case of the NEXT platform revolves around collecting users’ interaction data and optimizing for precision and speed to create an actionable ad experience that is personalized for each user. The platform processes complex data points to create interactive data visualizations that allow for accurate analysis. The platform uses Vertex AI to access managed tools, workflows, and infrastructure to build, deploy, and scale ML models, which has improved the accuracy of identifying segments for further analysis.

The platform ingests 200M data events, with peaks and valleys of activity. These events are processed to generate dashboards that enable users to visualize metrics based on filters in real time. These dashboards have high performance requirements: a responsive user interface under constantly changing data dimensions.

Google Cloud’s serverless stack, coupled with limitless data cloud infrastructure, has been core to the NEXT platform’s data-driven innovation. The growing volume of data ingested, streamed, and processed was scaled uniformly across the compute, storage, and analytical layers of the solution. A lean development team at Connected-Stories was able to focus all-in on the solution, while the serverless stack scaled, lowered the attack surface in terms of security, and optimized the cost footprint through pay-as-you-go features.

BigQuery has been the backbone supporting vast amounts of data spread over multiple geographies, resulting in workloads running at petabyte scale. BigQuery’s fully managed serverless architecture, real-time streaming, built-in machine learning, and rich business intelligence capabilities distinguish it from a typical cloud data warehouse. It is the foundation needed to approach data and serve users in an unlimited number of ways. For an application with zero tolerance for failure, BigQuery’s fully managed nature means it handles replication, recovery, data distribution optimization, and management.

The platform’s requirements include low maintenance, constantly ingesting and refreshing data, and smart tuning of aggregated data. These capabilities can be implemented with BigQuery’s materialized views feature. Materialized views are precomputed views that regularly cache query results for better performance. They read only the delta changes from the base tables to calculate up-to-date aggregations. Materialized views deliver faster outputs and consume fewer resources while reducing the cost footprint.
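
To illustrate the pattern (with a hypothetical project, dataset, and schema, not Connected-Stories’ actual one), a materialized view that keeps per-campaign impression counts fresh might be created from Python like this:

from google.cloud import bigquery

client = bigquery.Client()

# Precompute daily impression counts; BigQuery refreshes the view
# incrementally, reading only the delta from the base table.
sql = """
CREATE MATERIALIZED VIEW `my-project.ads.mv_daily_impressions` AS
SELECT
  campaign_id,
  DATE(event_ts) AS day,
  COUNT(*) AS impressions
FROM `my-project.ads.ad_events`
GROUP BY campaign_id, day
"""
client.query(sql).result()  # dashboards can now query the view directly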

Some key considerations in using Google Cloud, and in focusing on the serverless stack, include quick onboarding to development, prototyping in short sprints, and ease of preparing data in a rapidly changing environment. Typical considerations around low code / no code include data transformation, aggregation, and reduced deployment time. These considerations are fulfilled by using serverless capabilities within Google Cloud such as Pub/Sub, Cloud Storage, Cloud Run, Cloud Composer, Dataflow, and BigQuery, as described in the architecture below. The use of each of these components and services is described below.

Input/Ingest: At a high level, microservices hosted in Cloud Run collect and aggregate incoming ad events.

Enrichment: The output of this stage is a Pub-Sub message enriched with more attributes based on a pre-configured campaign. 

Store: A Cloud Dataflow streaming job writes the events as text files in Cloud Storage buckets.

Trigger: Cloud Composer triggers Spark jobs on the text files to process and group them, producing one record per impression, a logical group of events.

Deploy: Cloud Build is then used to automate all deployments. 

Thus far, all of these Google Cloud managed services work together to ingest, store, and trigger the orchestration; all are scalable via configuration, including autoscaling capabilities.

Visualization: A visualization tool reads data from BigQuery to compute pre-aggregations required for each dashboard. 

Data Model Evolution considerations: Though this solution served the purpose of creating pre-aggregations, whenever the data model evolved by adding a column or creating a new table, it required recreating the pre-aggregations and querying the data again. Alternatively, creating aggregate tables as an extra output of the current ETLs seemed like a viable option. However, this would increase the cost and complexity of the jobs. A similar need to reprocess or update aggregated tables would arise whenever data was updated.

Precomputed, periodically cached views of the data are critical to reaching the audience with the right message at the right time.

Performance: In order to increase the performance of the platform, we need regularly precomputed views of the data, cached for fast access.

Materialized Views: Consumers of these views needed faster response times, lower resource consumption, and outputs reflecting only the changes relative to a base table. BigQuery materialized views were used to solve this very requirement. Materialized views have been leveraged heavily to optimize the design, resulting in less maintenance and access to fresh data with high performance, at a relatively low technical investment in creating and maintaining SQL code.

Dashboards: Application dashboards pointing to the Materialized views are highly performant and provide a view into fresh data. 

Custom Reports with Vertex AI Notebooks: Vertex AI notebooks read data directly from BigQuery to produce custom reports for a subset of customers. Vertex AI has been hugely beneficial to data analysts, as an environment with pre-installed libraries simplifies getting started. Vertex AI Workbench notebooks are used to share these reports within the team, allowing analysts to work entirely in the cloud without ever needing to download data. It also increases the velocity of developing and testing ML models.

The NEXT platform has yielded benefits such as the ability for customers to create unique consumer journeys powered by AI/ML personalization triggers, and to use first-party data and business intelligence tools to capitalize on real-time creative intelligence: a dashboard for measuring campaign performance that lets cross-functional teams analyze the impact of the ad content experience at a granular level. All of this while ensuring controlled access to data and enriching it without moving it across clouds. The NEXT platform can keep up with increased demands for agility, scalability, and reliability through its underlying usage of Google Cloud.

Partnering with Google, in the context of the Google Built with BigQuery program, has surfaced differentiated value in creating interactive, personalized ads using real-time data. In addition, by sharing this data across organizations as assets, ML models have fueled higher levels of innovation. Connected-Stories plans to deepen its penetration into the entire spectrum of services offered in the AI/ML area to enhance core functionality and bring newer capabilities to the platform.

Click here to learn more about Connected-Stories NEXT Platform capabilities.

The Built with BigQuery Advantage for ISVs 

Through Built with BigQuery, launched in April ‘22 as part of the Google Data Cloud Summit, Google is helping tech companies like Connected-Stories co-innovate in building applications that leverage Google’s data cloud with simplified access to technology, helpful and dedicated engineering support, and joint go-to-market programs. Participating companies can:

Get started fast with a Google-funded, pre-configured sandbox. 

Accelerate product design and architecture through access to designated technical experts from the ISV Center of Excellence who can share insights from key use cases, architectural patterns, and best practices encountered in the field. 

Amplify success with joint marketing programs to drive awareness, generate demand, and increase adoption.

The Google Data Cloud spectrum of products and specifically BigQuery give ISVs the advantage of a powerful, highly scalable data warehouse that’s integrated with Google Cloud’s open, secure, sustainable platform. And with a huge and expanding partner ecosystem and support for multi-cloud, open source tools and APIs, Google provides technology companies the portability and extensibility they need to avoid data lock-in and exercise choice. 

We thank the Google Cloud and Connected-Stories team members who co-authored the blog: Luna Catini, Marketing Director, Connected-Stories; and Sujit Khasnis, Cloud Partner Engineering, Google.


How The FA is moving the goal posts with a data cloud approach in Qatar

We’re moments away from the kick-off of another historic tournament. After the England men’s football team reached the Euro 2020 final in last year’s pandemic-delayed competition, there is genuine confidence in a successful run in Qatar.

The Football Association (The FA) is the governing body of association football in England, and has left no stone unturned in its preparations; they have increasingly looked to physical performance data as a way to help support players on the pitch. Maintaining accurate and insightful information on fitness, conditioning, and nutrition also helps ensure player welfare – something that gets more important with every fixture in a tournament environment.

The need for improved understanding of how players are faring was the reason The FA set up the Performance Insights strand of its Physical Performance, Medicine, and Nutrition department during lockdown in 2020. And they used Google Cloud to help them revolutionize the way they capture, store, and process information.

A single 90-minute squad training session can generate millions of rows of data. In football, things change so quickly that this data begins to lose relevance as soon as the players are back in the dressing room. That’s why The FA needed a solution which could turn raw data into valuable, easy-to-understand insights. This led the team to BigQuery, Google Cloud’s data warehouse solution.

BigQuery enables The FA’s Performance Insights team to automate previously labor-intensive tasks, and for all the information to be stored in a single, centralized platform for the first time. By collating different data sources across The FA’s squads, there can be greater clarity and fewer siloes – everyone is working towards the same goals. 

A unique solution for a unique tournament

Access to insights is vital in any tournament situation, but this year there is a need for speed like never before. 

Unlike previous tournaments, Qatar will start in the middle of domestic league seasons throughout the world. Traditionally, international sides are able to meet up for nearly a month between the end of the league season and the start of the tournament – a critical time to work on all aspects of team preparation, including tactics and conditioning. By contrast, this year the England players will have less than a week to train together before the first kick-off.

BigQuery allows The FA’s data scientists to combine data on many aspects of a player’s physical performance captured during a training camp, from intensity to recovery. This can enable more useful conversations on the ground and can help create more individualized player management. And by using BigQuery’s customizable user-defined functions, the same data can be tweaked and tailored to fit needs across departments.
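
As an illustration of that tailoring (with a hypothetical schema, not The FA’s real one), a temporary SQL user-defined function lets one department weight the same raw data its own way:

from google.cloud import bigquery

client = bigquery.Client()

# A temporary SQL UDF: one department's definition of "session load".
sql = """
CREATE TEMP FUNCTION session_load(distance_m FLOAT64, sprints INT64) AS (
  0.6 * (distance_m / 1000) + 0.4 * sprints
);
SELECT player_id, session_load(total_distance_m, sprint_count) AS load_score
FROM `my-project.performance.training_sessions`
ORDER BY load_score DESC
"""
for row in client.query(sql).result():
    print(row.player_id, round(row.load_score, 1))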

This customizability provides a foundation for a truly ‘interdisciplinary’ team in which doctors, strength and conditioners, physios, psychologists, and nutritionists have a common understanding of the support a player needs.

Every minute will count during such a compressed training window, so automation is key. While BigQuery is the core product The FA uses to store and manipulate data, it’s just one part of a suite of Google Cloud products and APIs that help them easily turn data into insights. 

In-game and training performance data, along with data on players’ sleep, nutrition, recovery, and mental health, can be captured and fed through Python pipelines that link straight into BigQuery via Pub/Sub. BigQuery’s native connectors then stream insights to visual dashboards that convey them in a meaningful, tangible format.
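
A minimal sketch of that capture step, assuming the google-cloud-pubsub client library and a hypothetical project and topic name, might look like this:

import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic = publisher.topic_path("my-project", "player-performance")  # hypothetical names

event = {
    "player_id": "P-007",
    "metric": "sleep_hours",
    "value": 7.5,
    "captured_at": "2022-11-20T07:30:00Z",
}
# Pub/Sub carries bytes; a BigQuery-bound subscription (or a small
# Dataflow job) can land each message in a table seconds later.
publisher.publish(topic, json.dumps(event).encode("utf-8")).result()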

Before leveraging the power of Google Cloud, this work could take several hours each day. Now, it can take a minute from data capture to the coaches having access to clear and actionable information. 

Predicting a bright future for the Beautiful Game

We won’t have long to wait to see how England will perform in Qatar. But the benefits of The FA’s cloud-enabled approach to data science will continue long after the final whistle has blown.

The short preparation window has posed challenges for The FA, but it has also given the organization a unique opportunity to discover how predictive analytics and machine learning on Google Cloud could further enhance its player performance strategy. 

The Physical Performance, Medicine, and Nutrition department has collected performance data from players throughout this year’s league season, taking into account fixture density and expected physical demand. They hope to use this to support the players’ physical preparation and recovery during the tournament based on individual physical performance profiles.

This ML work is still in the early stages. But the Performance Insights team is confident that by developing even closer relationships with Google Cloud and even greater familiarity with its technology, they will be able to unlock an even greater level of insight into player performance.

Learn more about how Google Cloud can turn raw data into actionable insights, fast.


What can you build with the new Google Cloud developer subscription?

To help you grow and build faster – and take advantage of the 123 product announcements from Next ‘22 – last month we launched the Google Cloud Skills Boost annual subscription with new Innovators Plus benefits. We’re already hearing rave reviews from subscribers from England to Indonesia, and want to share what others are learning and doing to help inspire your next wave of Google Cloud learning and creativity.

First, here’s a summary of what the Google Cloud Skills Boost annual subscription[1] with Innovators Plus benefits includes:

Access to 700+ hands-on labs, skill badges, and courses

$500 Google Cloud credits

A Google Cloud certification exam voucher

Bonus $500 Google Cloud credits after the first certification earned each year

Live learning events led by Google Cloud experts

Quarterly technical briefings hosted by Google Cloud executives

Celebrating learning achievements

Subscribers get access to everything needed to prepare for a Google Cloud certification exam; these are among the top-paying IT certifications in 2022[2]. Subscribers also receive a certification exam voucher to redeem when booking the exam.

Jochen Kirstätter, a Google Developer Expert and Innovator Champion, is using the subscription to prepare for his next Google Cloud Professional certification exam, and has found that the labs and courses on Google Cloud Skills Boost have helped him feel ready to get #GoogleCloudCertified.

“‘The only frontiers are in your mind’ – with the benefits of #InnovatorsPlus I can explore more services and practice real-life scenarios intensively for another Google Cloud Professional certification.”

Martin Coombes, a web developer from PageHub Design, is a new subscriber and has already become certified as a Cloud Digital Leader. That means he’s been able to unlock the bonus $500 of Google Cloud credit benefit to use on his next project. 

“For me, purchasing the annual subscription was a no brainer. The #InnovatorsPlus benefits more than pay back the investment and I’ve managed to get my first Google Cloud certification within a week using the amazing Google Cloud Skills Boost learning resources. I’m looking forward to further progressing my knowledge of Google Cloud products.”

Experimenting and building with $500 of Google Cloud credits 

We know how important it is to learn by doing. And isn’t hands-on more fun? Another great benefit of the annual subscription is $500 of Google Cloud credits every year you are a subscriber. And even better, once you complete a Google Cloud certification, you will unlock a bonus $500 of credits to help build your next project just like Martin and Jeff did. 

Rendy Junior, Head of Data at Ruangguru and a Google Cloud Innovator Champion, has already been able to apply the credits to an interesting data analysis project he’s working on. 

“I used the Google Cloud credits to explore new features and data technology in DataPlex. I tried features such as governance federation and data governance whilst data is located in multiple places, even in different clouds. I also tried DataPlex data cataloging; I ran a DLP (Data Loss Prevention) inspection and fed the tag where data is sensitive into the DataPlex catalog. The credits enable me to do real world hands-on testing which is definitely helpful towards preparing for certification too.”

Jeff Zemerick recently discovered the subscription and achieved his Professional Cloud Database certification, using the voucher and Google Cloud credits to prepare.

“I was preparing for the Google Cloud Certified Professional Cloud Database exam and the exam voucher was almost worth it by itself. I used some of the $500 cloud credits to prepare for the exam by learning about some of the Google Cloud services where I felt I might need more hands-on experience. I will be using the rest of the credits and the additional $500 I received from passing the exam to help further the development of our software to identify and redact sensitive information in the Google Cloud environment. I’m looking forward to using the materials available in Google Cloud Skills Boost to continue growing my Google Cloud skills!”

Grow your cloud skills with live learning events 

Subscribers gain access to live learning events, where a Google Cloud trainer teaches popular topics in a virtual classroom environment. Live learning events cover topics like BigQuery, Kubernetes, Cloud Run, Cloud Storage, networking, and security. We’ve set these up to go deep: mini live-learning courses consist of two highly efficient hours of interactive instruction, and gamified live learning events are three hours of challenges and fun. We’ve already had over 400 annual subscribers reserve a spot for upcoming live learning events. Seats are filling up fast for the November and December events, so claim yours before it’s too late.

Shape the future of Google Cloud products through the quarterly technical briefings  

As a subscriber, you are invited to join quarterly technical briefings, getting insight into the latest product developments and new features, with the opportunity for subscribers to engage and shape future product development for Google Cloud. Coming up this quarter, get face time with Matt Thompson, Google Cloud’s Director of Developer Adoption, who will demonstrate some of the best replicable uses of Google Cloud he’s seen from leading developers. 

Start your subscription today 

Take charge of your cloud career today by visiting cloudskillsboost.google to get started with your annual subscription. Make sure to activate your Innovators Plus badge once you do and enjoy your new benefits. 

[1] Subject to eligibility limitations.
[2] Based on responses from the Global Knowledge 2022 IT Skills and Salary Survey.


Accelerate innovation in life sciences with Google Cloud

The last few years have underscored the importance of speed in bringing new drugs and medical devices to market, while ensuring safety and efficacy. Over this time, healthcare and life sciences organizations have transformed the way they research, develop, and deliver patient care by embracing agility and innovation. 

Now, the industry is set to reap the benefits of cloud technology and overcome the existing barriers to innovation.

Watch a 2-min overview of how Google Cloud helps life sciences accelerate innovation across the value chain.

What’s holding back innovation?

Costly clinical trials: The process of trialing and developing new drugs and devices is still long and costly, with more than 1 in 5 clinical trials failing due to a lack of funding.[1] The high failure rate comes as no surprise when you consider the average clinical trial costs $19 million and takes 10-15 years (through all 3 phases) to be approved.[2]

Stringent security requirements: Pre-clinical R&D and clinical trials use large volumes of highly sensitive patient data, making the life sciences industry one of the top sectors targeted by hackers.[3] On top of this, the FDA and other regulatory bodies have strict requirements for medical device cybersecurity.

Unpredictable supply chains: Global supply chains are becoming increasingly complex and unpredictable. This can be brought on by anything from supply shortages, to geo-political events, and even bad weather. Making things worse is the lack of visibility into medical shipment disruptions – so when disaster strikes you’re often caught off guard.

Google Cloud for life sciences

At Alphabet, we’ve made significant investments in healthcare and life sciences, helping to tackle the world’s biggest healthcare problems, from chronic disease management, to precision medicine, to protein folding. 

Together with Google, you can transform your life sciences organization and deliver secure, data-driven innovation across the value chain. 

Accelerate clinical trials to deliver life-saving treatments faster and at less cost. Clinical trials require relevant and equitable patient cohorts that can produce clinically valid data. Solutions like DocAI can enable optimal patient matching for clinical trials, helping organizations optimize trial selection and shorten time to value. How that patient data is collected is also important. Collection in a physician’s office captures a snapshot of the participant’s data at one point in time and doesn’t necessarily account for daily lifestyle variables. Fitbit, used in more than 1,500 published studies (more than any other wearable device), can enrich clinical trial endpoints with new insights from longitudinal lifestyle data, which can help improve patient retention and compliance with study protocols. We have introduced Device Connect for Fitbit, which empowers healthcare and life sciences enterprises with accelerated analytics and insights to help people live healthier lives. We are able to empower organizations to improve clinical trials in key ways:

Enable clinical trial managers to quickly create and launch mobile and web RWE collection mechanisms for patient-reported outcomes

Enable privacy controls with Cloud Healthcare Consent API and, as needed, remove PHI using Cloud Healthcare De-identification API 

Ingest RWE and other study data into BigQuery for analysis (a minimal ingestion sketch follows this list)

Leverage Looker to enable quick visualization and powerful analysis of a study’s progress and results
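
To make the ingestion and analysis steps concrete, here is a minimal sketch using the BigQuery Python client. It is not Google's reference implementation: the Cloud Storage path, dataset, table, and column names are all hypothetical, and it assumes the patient-reported-outcome exports have already been de-identified. It loads the files into a table and runs the kind of progress query a Looker dashboard could sit on top of.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Hypothetical names for illustration: a GCS export of de-identified
# patient-reported outcomes and a target table in a study dataset.
SOURCE_URI = "gs://example-rwe-exports/patient_reported_outcomes/*.csv"
TABLE_ID = "my-project.clinical_trials.patient_reported_outcomes"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,      # header row
    autodetect=True,          # infer the schema from the CSV
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # wait for the load to complete

# A simple study-progress query of the kind a Looker dashboard could use.
# study_site and participant_id are hypothetical column names.
query = f"""
    SELECT study_site, COUNT(DISTINCT participant_id) AS enrolled
    FROM `{TABLE_ID}`
    GROUP BY study_site
    ORDER BY enrolled DESC
"""
for row in client.query(query).result():
    print(row.study_site, row.enrolled)
```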

Ensure security and privacy for a safe, coordinated, and compliant approach to digital transformation. Google Cloud offers customers a comprehensive set of services, including pioneering capabilities such as BeyondCorp Enterprise for Zero Trust and VirusTotal for malicious content and software vulnerabilities; Chronicle's security analytics and automation, coupled with services such as Security Command Center, to help organizations detect and protect themselves from cyber threats; and expertise from Google Cloud's Cybersecurity Action Team. Google Cloud also recently acquired Mandiant, a leader in dynamic cyber defense, threat intelligence, and incident response services.

Optimize supply chains and enhance your data to prepare for the unpredictable. With a digital supply chain platform, we can empower supply chain professionals to solve problems in real time, with capabilities that include end-to-end visibility and advanced analytics, alert-based event management, collaboration between teams and partners, and AI-driven optimization and simulation.

Ready to learn more? We’ll be taking a deep dive into each of the challenges outlined above in our life sciences video series. Stay tuned.

1. National Library of Medicine
2. How much does a clinical trial cost?
3. Life Sciences Industry Becomes Latest Arena in Hackers’ Digital Warfare

Related Article

How Google Cloud and Fitbit are building a better view of health for hospitals, with analytics and insights in the cloud

Exploratory research into detection and prevention of cardiovascular diseases using the latest wearable technology and AI-driven analytics.

Read Article

Source : Data Analytics Read More

BigQuery helps Soundtrack Your Brand hit the high notes without breaking a sweat

Editor’s note: Soundtrack Your Brand is an award-winning streaming service with the world’s largest  licensed music catalog built just for businesses, backed by Spotify. Today, we hear how BigQuery has been a foundational component in helping them transform big data into music. 

Soundtrack Your Brand is a music company at heart, but big data is our soul. Playing the right music at the right time has a huge influence on the emotions a brand inspires, the overall customer experience, and sales. We have a catalog of over 58 million songs and their associated metadata from our music providers, plus a vast amount of user data that helps us deliver personalized recommendations, curate playlists and stations, and even generate listening schedules. For example, through our Schedules feature, customers can set up what to play during the week. Taking that one step further, we provide suggestions on what to use in different time slots and recommend entire schedules.

Using BigQuery, we built a data lake to empower our employees to access all this content and metadata in a structured way. Ensuring that our data is easily discoverable and accessible allows us to build any type of analytics or machine learning (ML) use case and run queries reliably and consistently across the complete data set. Today, our users benefit from these advanced analytics through the personalized recommendations we offer across our core features: Home, Search, Playlists, Stations, and Schedules.

Fine-tuning developer productivity

The biggest business value that comes from BigQuery is how much it speeds up our development capabilities and allows us to ship features faster. In the past three years, we have built more than 150 pipelines and more than 30 new APIs within our ML and data teams, which total about 10 people. That is an impressive rate of a new pipeline every week and a new API every month. With everything in BigQuery, it's easy to simply write SQL and have it orchestrated within a CI/CD toolchain to automate our data processing pipelines. An in-house tool built as a GitHub template, in many ways very similar to Dataform, helps us build very complex ETL processes in minutes, significantly reducing the time spent on data wrangling. A minimal sketch of this pattern follows.
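
Soundtrack Your Brand's in-house tool isn't public, but the underlying pattern is simple. Here is a minimal sketch, not their implementation: a templated SQL transformation executed through the BigQuery Python client, the kind of step a CI/CD job could run after tests pass. The project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# A templated ETL step: materialize a daily aggregate from a raw event
# table. All names here are hypothetical placeholders.
ETL_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_plays` AS
SELECT
  DATE(played_at) AS play_date,
  track_id,
  COUNT(*) AS plays
FROM `my-project.raw.playback_events`
GROUP BY play_date, track_id
"""

# In a CI/CD toolchain, a job runner would execute this script on merge
# or on a schedule; the "pipeline" is just version-controlled SQL.
client.query(ETL_SQL).result()
print("daily_plays refreshed")
```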

BigQuery acts as a cornerstone for our entire data ecosystem, a place to anchor all our data and serve as our single source of truth. This single source of truth has expanded the limits of what we can do with our data. Most of our pipelines start from or end at the data lake, increasing reusability of data and collaboration. For example, one of our interns built an entire churn prediction pipeline in a couple of days on top of existing tables that are produced daily. Nearly a year later, this pipeline is still running without failure, largely due to its simplicity. The pipeline is BigQuery queries chained together into a BigQuery ML model, running on a schedule with Kubeflow Pipelines.

Once we made BigQuery the anchor for our data operations, we discovered we could apply it to use cases you might not expect, such as maintaining our configurations or supporting our content management system. For instance, we created a Google Sheet where our music experts can correct genre classification mistakes for songs by simply adding a row. Instead of spending hours or days creating a bespoke tool, we set everything up in a few minutes.

BigQuery's ability to consume Google Sheets directly allows business users who play key roles in improving our recommendations engine and curating our music, such as our content managers and DJs, to contribute to the data pipeline. A sketch of how such a Sheets-backed table can be defined follows.
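
Here is a minimal sketch of defining a Sheets-backed external table with the BigQuery Python client. The spreadsheet URL, project, dataset, and column names are hypothetical placeholders, not Soundtrack Your Brand's actual setup.

```python
from google.cloud import bigquery

# Note: querying Sheets-backed tables requires credentials that carry the
# Google Drive scope in addition to the BigQuery scope.
client = bigquery.Client()

# Hypothetical spreadsheet of genre corrections maintained by music experts.
SHEET_URL = "https://docs.google.com/spreadsheets/d/your-sheet-id"

external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = [SHEET_URL]
external_config.autodetect = True              # infer columns, e.g. track_id, genre
external_config.options.skip_leading_rows = 1  # header row

table = bigquery.Table("my-project.curation.genre_corrections")
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# Downstream pipelines can now JOIN against the live sheet contents;
# each new row an expert adds is visible on the next query.
```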

Another example is our use of BigQuery as an index for some of our large Cloud Storage buckets. By using Cloud Functions to subscribe to read/write events on a bucket and writing those events to partitioned tables, our pipelines can quickly and naturally search and access files, such as when downloading and processing the audio of new track releases. We also use log events, triggered when a table is added to a dataset, to kick off pipelines that process data on demand, such as JSON/CSV files from some of our data providers that are newly imported into BQ. As the place for all file integration and processing, BQ makes new data quickly available to our entire data ecosystem in a timely and cost-effective manner while supporting data retention, ETL, ACLs, and easy introspection. A minimal sketch of the bucket-index pattern follows.
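
Here is a minimal sketch of that bucket-index pattern under assumed names: a Cloud Storage-triggered Cloud Function (2nd gen, using the Functions Framework) that appends one row per object event to a BigQuery table. The table is assumed to already exist with a matching schema, partitioned on event_time; all names are hypothetical.

```python
import functions_framework
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical day-partitioned index table (partitioned on event_time).
INDEX_TABLE = "my-project.storage_index.bucket_events"

@functions_framework.cloud_event
def index_object(cloud_event):
    """Triggered on object finalization; appends one row per file event."""
    data = cloud_event.data
    row = {
        "bucket": data["bucket"],
        "object_name": data["name"],
        "size_bytes": int(data.get("size", 0)),
        "event_time": data["timeCreated"],
    }
    # Streaming insert; pipelines can then query the index instead of
    # listing the bucket to find new track releases to process.
    errors = client.insert_rows_json(INDEX_TABLE, [row])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```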

BigQuery makes everything simple. We can make a quick partitioned table and run queries that use thousands of CPU hours to sift through a massive volume of data in seconds — and only pay a few dollars for the service. The result? Very quick, cost-effective ETL pipelines. 

In addition, centralizing all of our data in BigQuery makes it possible to easily establish connections between pipelines, providing developers with a clear understanding of what specific type of data a pipeline will produce. If a developer wants a different outcome, she can copy the GitHub template and change some settings to create a new, independent pipeline.

Another benefit is that developers don’t have to coordinate schedules or sync with each other’s pipelines: they just need to know that a table that is updated daily exists and can be relied on as a data source for an application. Each developer can progress their work independently without worrying about interfering with other developers’ use of the platform.

Making iteration our forte

Out of the box, BigQuery met and exceeded our performance expectations, but ML performance was the area that really took us by surprise. Suddenly, we found ourselves going through millions of rows in a few seconds, where the previous method might have taken an hour. This performance boost ultimately let us improve our artist clustering workload from more than 24 hours on a job running 100 CPU workers to 10 minutes on a BigQuery pipeline running inference queries in a loop until convergence. This more-than-140x performance improvement also came at 3% of the cost.

Currently, we have more than 100 neural network models being trained and run regularly in batch in BQML. This setup has become our favorite method for both fast prototyping and creating production-ready models. Not only is hyperparameter tuning fast and easy in BQML, but our benchmarks show performance metrics comparable to using our own TensorFlow code. We now use TensorFlow sparingly. Differences in input data can have an even greater impact on the end-user experience than individual tweaks to the models.

BigQuery's performance makes it easy to iterate with the domain experts who help shape our recommendations engine or who are concerned about churn, as we can show them in real time how changes to input data affect our recommendations. One of our favorite things to do is build a Data Studio report that has the ML.PREDICT query as part of its data source query. The report shows examples of good and bad predictions alongside bias/variance summaries, with a series of drop-downs, thresholds, and toggles to control the input features and the output threshold. We give that report to our team of domain experts to help manually tune the models, putting model tuning right in their hands. Having humans in the loop has become trivial for our team. In addition to fast iteration, the BigQuery ML approach is also very low maintenance. You don't need to write a lot of Python or Scala code or maintain and update multiple frameworks; everything can be written as SQL queries run against the data store. A minimal train-and-predict sketch follows.
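
Here is a minimal sketch of the BQML train-and-predict loop, run through the Python client. The model, dataset, table, and column names are hypothetical; the ML.PREDICT statement is the kind of query a report's data source can embed directly.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Train a small neural-network classifier directly in BigQuery ML.
# Feature and label column names below are hypothetical.
TRAIN_SQL = """
CREATE OR REPLACE MODEL `my-project.ml.churn_model`
OPTIONS (
  model_type = 'DNN_CLASSIFIER',
  input_label_cols = ['churned']
) AS
SELECT plays_last_30d, days_since_signup, skips_per_session, churned
FROM `my-project.ml.churn_features`
"""
client.query(TRAIN_SQL).result()

# Score fresh rows; a Data Studio report can use this same ML.PREDICT
# query as its data source so domain experts see results immediately.
PREDICT_SQL = """
SELECT *
FROM ML.PREDICT(
  MODEL `my-project.ml.churn_model`,
  TABLE `my-project.ml.current_customers`
)
"""
for row in client.query(PREDICT_SQL).result():
    print(dict(row))
```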

Helping brands to beat the band—and the competition 

BigQuery has allowed us to establish a single source of truth for our company that our developers and domain experts can build on to create new and innovative applications that help our customers find the sound that fits their brand. 

Instead of cobbling together data from arbitrary sources, our developers now always start with a dataset from BigQuery and build forward. This guarantees the stability of our data pipeline and makes it possible to build outward into new applications with confidence. Moreover, BigQuery's performance means domain experts can more easily interact with the analytics and applications developers create and quickly see the results of their recommended improvements to ML models or data inputs. This rapid iteration drives better business results, keeps our developers and domain experts aligned, and ensures Soundtrack Your Brand keeps delivering sound that stands out from the crowd.

Related Article

How Telus Insights is using BigQuery to deliver on the potential of real-world big data

BigQuery’s impressive performance reduces processing time from months to hours and delivers on-demand real-world insights for Telus.

Read Article

Source : Data Analytics Read More

Advances In AI Help Marketers With Live Streaming Video Marketing

The COVID-19 pandemic fundamentally altered the marketing landscape, in many ways for the better. While live streaming and video marketing have long been part of a marketer's toolkit, the prolonged lockdowns, social distancing, and travel bans over the course of the pandemic thrust them into the limelight, driving widespread adoption and making them indispensable for business.

This quick adoption and seamless integration into the marketing workflows of global organizations was largely made possible by substantial advances in AI, especially in recent years. While often subtle and barely noticeable, these advancements have paved the way for mainstream adoption of live-streamed video marketing and webinar content, with no going back to the old ways for most teams.

Why Run A Webinar At All?

Even though webinars have been around since the early 2000s, the crux of B2B marketing and enterprise sales still relied on trade shows, conferences, cold calls, and physical demonstrations. Webinars were mostly reserved for conducting classes, seminars, or courses with a distributed or worldwide audience, albeit with limited capabilities owing to bandwidth restrictions.

Today, webinars represent a great way for marketers to get in touch with potential customers all over the world, make their pitches, demonstrate capabilities, and answer questions without friction. For deeper insights on the use of webinars for marketing and lead generation, you can refer to ActualTech's IT webinar guide to better understand the substantial use cases.

AI Advances Helping With Live Streaming Video Marketing

While the COVID-19 pandemic was the catalyst that coerced professionals across the globe to adopt and accept webinars as a way to keep their sales and operations chugging along, it was the remarkable advances in the core technology that allowed for such a seamless transition.

Almost anyone who has been in touch with webinar and video conferencing tech over the past decade can vouch for the magnitude of changes witnessed in this segment over the past few years. Here are some of the notable advances that we know of in this regard.

1. Increased Personalization

The first, foremost, and least subtle of the lot is the enhanced personalization capabilities when it comes to webinars.

Along the same lines as Netflix using your watch history to recommend shows and movies to watch next, AI has made phenomenal strides in personalizing offerings for the wide range of people attending a free webinar.

2. Tracking & Analytics

A webinar with thousands of attendees is often a treasure trove of data. Everything from engagement levels to the time attendees spent listening stands to add substantial value for marketing teams looking to refine their content going forward.

Certain segments witness high engagement while others are a snooze-fest; knowing which is which can help hosts better plan their content and maximize engagement in future sessions.

Advances in analytics in this regard can further help you ascertain the total leads, sales, and other conversions resulting from a webinar.

3. Content Localization

One of the most remarkable AI advances pertaining to live streaming is real-time live captioning, available in a variety of local languages. If a section of your audience is unable to receive quality audio and video, they can always follow the subtitles or the automatically generated transcript to get the gist of your offerings.

Final Words

These are undeniably exciting times for both AI and the marketing profession. With a flurry of activity taking place across AR, MR, the metaverse, and more, all of which will likely converge with AI in many ways, the future of marketing and communications is scary and invigorating at the same time.

Source : SmartData Collective Read More