
Go from logs to security insights faster with Dataform and Community Security Analytics

Making sense of security logs such as audit and network logs can be a challenge, given the volume, variety, and velocity of valuable logs from your Google Cloud environment. To help accelerate the time to security insights from your logs, the open-source Community Security Analytics (CSA) provides pre-built queries and reports you can use on top of Log Analytics powered by BigQuery. Customers and partners use CSA queries to help with data usage audits, threat detection and investigation, behavioral analytics, and network forensics. It’s now easier than ever to deploy and operationalize CSA on BigQuery, with significant query performance gains and cost savings.

In collaboration with Onix, a premier Google Cloud service partner, we’re delighted to announce that CSA can now be deployed via Dataform, a BigQuery service and open-source data modeling framework for managing the Extraction, Loading, and Transformation (ELT) process for your data. Now, you can automate the rollout of CSA reports and alerts with cost-efficient summary tables and entity lookup tables (e.g., unique users and IP addresses seen). Dataform handles the infrastructure and orchestration of ELT pipelines that filter, normalize, and model log data, starting from the raw logs in Log Analytics, into curated, up-to-date BigQuery tables and views for the approximately 50 CSA use cases, as shown in the dependency tree below.
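To give a sense of what Dataform manages here, below is a minimal sketch of an incremental summary-table definition in Dataform's SQLX format. It is illustrative only: the dataset name, field paths, filter, and thresholds are assumptions rather than the actual CSA code, which lives in the CSA Dataform repository.

config {
  type: "incremental",
  schema: "csa",  // hypothetical target dataset
  description: "Illustrative daily count of admin actions per user"
}

SELECT
  TIMESTAMP_TRUNC(timestamp, DAY) AS day,
  proto_payload.audit_log.authentication_info.principal_email AS principal_email,
  COUNT(*) AS action_count
FROM ${ref("_AllLogs")}  -- declared source pointing at the Log Analytics view
WHERE log_id = "cloudaudit.googleapis.com/activity"
  ${when(incremental(), `AND timestamp > (SELECT MAX(day) FROM ${self()})`)}
GROUP BY day, principal_email

On a schedule, Dataform compiles a definition like this into SQL that appends only new rows to the summary table, which is what keeps the downstream CSA queries cheap.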

The best of Google Cloud for your log analysis


Dataform, alongside Log Analytics in Cloud Logging and BigQuery, provides the best of Google Cloud for your log management and analysis.

First, BigQuery provides the fully managed, petabyte-scale, centralized data warehouse to store all your logs (as well as other security data such as your Security Command Center findings).

Then, Log Analytics from Cloud Logging provides a native and simple solution to route and analyze your logs in BigQuery by enabling you to analyze in place without exporting or duplicating logs, or worrying about partitioning, clustering or setting up search indexes.

Finally, Dataform sets up the log data modeling necessary to report, visualize, and alert on your logs using normalized, continuously updated summary tables derived from the raw logs.

Why deploy CSA with Dataform?

Optimize query cost and performance

By querying logs from the summary tables, the amount of data scanned is significantly reduced compared to querying the source BigQuery _AllLogs view. In our internal test environment, the data scanned was often less than 1% of the original (example screenshot below), which is expected given how voluminous the raw logs are relative to their summaries.

Summary table for DNS logs enables 330x efficiency gains in data scanned by SQL queries

This leads to significant cost savings for read-heavy workloads such as reporting and alerting on logs. It is particularly important for customers who use BigQuery scheduled queries for continuous alerting, and/or a business intelligence (BI) tool on top of BigQuery, such as Looker or Grafana, for monitoring.
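As a sketch of what such an alerting query can look like (the table and column names below are illustrative, not the actual CSA schema), a scheduled query only has to read the compact summary table instead of the raw _AllLogs view:

-- Hypothetical scheduled query: surface unusually active users from the
-- daily user-actions summary table rather than from raw logs.
SELECT day, principal_email, action_count
FROM `my_project.csa.4_01_summary_daily`
WHERE day >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND action_count > 1000  -- illustrative threshold
ORDER BY action_count DESC;

Because the summary table holds one row per user per day instead of every raw log entry, a query like this scans a small fraction of the data the raw view would require.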

Note: Unlike reporting and alerting, ad hoc search for troubleshooting or investigation can be done through the Log Analytics user interface at no additional query cost.

Segregate logs by domains and drop sensitive fields

Logs commonly contain sensitive and confidential information. By storing your log data in separate domain-specific tables, you can help ensure that authorized users only view the logs they need to perform their job. For example, a network forensics analyst may only need access to network logs, as opposed to other sensitive logs like data audit logs. With Dataform for CSA, you can enforce this separation of duties with table-level permissions, by granting the analyst read-only access to the network activity summary tables (for CSA 6.*) but not to the data usage summary tables (for CSA 5.*).
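As an illustration (the dataset, table, and group below are placeholders), this kind of table-level grant can be expressed directly in BigQuery SQL:

-- Read-only access to the network activity summary table only;
-- no grant is issued on the data usage summary tables.
GRANT `roles/bigquery.dataViewer`
ON TABLE `my_project.csa.6_01_summary_daily`
TO "group:network-forensics@example.com";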

Furthermore, by summarizing the data over time, hourly or daily, you can eliminate potentially sensitive low-level information. For example, request metadata including caller IP and user agent is not captured in the user actions summary table (for CSA 4.01). This way, an ML researcher performing behavioral analytics can focus on user activities over time to look for anomalies, without accessing personal user details such as IP addresses.

Unlock AI/ML and gen AI capabilities

Normalizing log data into simpler and smaller tables greatly accelerates time to value. For example, analyzing the summarized and normalized BigQuery table for user actions (that is, the 4_01_summary_daily table depicted below) is significantly simpler and delivers more insight than trying to analyze the _AllLogs BigQuery view in its original raw format. The latter has a complex (and sometimes obscure) schema with several nested records and JSON fields, which limits the ability to parse the logs and identify patterns.

A snippet of source _AllLogs schema (left) compared to low-dimensional summary table (right)

The normalized logs allow you to scale ML opportunities both computationally (because the dataset is smaller and simpler) and organizationally, since ML researchers don’t need to be familiar with the Cloud Logging log schema or LogEntry definition to analyze summary tables such as daily user actions.

This also enables gen AI opportunities such as using LLMs to generate SQL queries from natural language based on a given database schema. There is a lot of ongoing research on using LLMs for text-to-SQL applications, and early research has shown promising results: simpler schemas and distinct, domain-specific datasets yield reasonably accurate SQL queries.
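For instance, a question such as "Which users performed the most actions yesterday?" maps to a short query over the flat summary table (column names assumed, as in the earlier sketches), a far easier target for an LLM than the nested _AllLogs schema:

SELECT principal_email, action_count
FROM `my_project.csa.4_01_summary_daily`
WHERE DATE(day) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
ORDER BY action_count DESC
LIMIT 10;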

How to get started

Before leveraging BigQuery Dataform for CSA, aggregate your logs in a central log bucket and create a linked BigQuery dataset provided by Log Analytics. If you haven’t done so, follow the steps to route your logs to a log bucket (select the Log Analytics tab) as part of the security log analytics solution guide.

See Getting started in the CSA Dataform README to start building your CSA tables and views off of the source logs view, i.e., the BigQuery view _AllLogs from Log Analytics. You can run Dataform through Google Cloud console (more common) or via the Dataform CLI.

You may want to use the Dataform CLI for a quick one-time or ad hoc Dataform execution to process historical logs, where you specify a desired lookback time window (default is 90 days). 

However, in most cases, you need to set up a Dataform repository via the Cloud console, as well as Dataform workflows for scheduled Dataform executions to process historical and streaming logs. Dataform workflows will continuously and incrementally update your target dataset with new data on a regular schedule, say hourly or daily. This enables you to continuously and cost-efficiently report on fresh data. The Cloud console Dataform page also allows you to manage your Dataform resources, edit your Dataform code inline, and visualize your dataset dependency tree (like the one shown above), all with access control via fine-grained Dataform IAM roles.

Leverage partner delivery services

Get on the fast path to building your own security data lake and monitoring on BigQuery by using Dataform for CSA and leveraging specialized Google Cloud partners.

Onix, a Premier Google Cloud partner and a leading provider of data management and security solutions, is available to help customers leverage this new Dataform for CSA functionality including:

Implementing security foundations to deploy this CSA solution

Setting up your Google Cloud logs for security visibility and coverage

Deploying CSA with Dataform following Infrastructure as Code and data warehousing best practices

Managing and scaling your reporting and alerting layer as part of your security foundation

In summary, BigQuery natively stores your Google Cloud logs via Log Analytics, as well as your high-fidelity Security Command Center alerts. No matter the size of your organization, you can deploy CSA with Dataform today to report and alert on your Google Cloud security data. By leveraging specialized partners like Onix to help you design, build, and implement your security analytics with CSA, you can build a security data lake that meets your specific security and compliance requirements today.

Source: Data Analytics

Deliver trusted insights with Dataplex data profiling and automatic data quality

We are excited to announce the general availability of data profiling and automatic data quality (AutoDQ) in Dataplex. These features enable Google Cloud customers to build trust in their analytical data in a scalable and automated manner. 

Power innovation, decision-making, and differentiated customer experience with high-quality data

Data quality has always been an essential foundation for successful analytics and ML models. Over the past six months, the rapid rise of artificial intelligence (AI) has led to an explosion in the use of machine learning (ML) models, making data quality even more critical. Data scientists and analysts need to understand their data more deeply before building models, which ultimately leads to more accurate and reliable ML outcomes.

Dataplex data profiling and AutoDQ make it easy to build and maintain this information in a scalable and efficient manner. These features offer:

Reduction in time to insights about the data

Dataplex makes it easy and quick to go from data to its profile and quality. These features have zero setup requirements and are easy to start from the BigQuery UI. Dataplex AutoDQ will get you started with intelligent rule recommendations and generate quality reports within a short time. Similarly, with a single click, Dataplex data profiling will generate meaningful insights like data distribution, top-N values, and unique percentages.

Rich platform capabilities

The underlying capabilities of the platform allow users to build an end-to-end solution with the desired customizations. Dataplex AutoDQ enables a data quality solution spanning rules, reports, and alerts. AutoDQ rules can also incorporate BigQuery ML for advanced use cases. With the information data profiling generates, you can build custom AI/ML models, such as drift detection for spotting meaningful shifts in your training data.

Secure, performant, and efficient execution

These features are designed to work with petabyte-scale data without any data copies. While they leverage the scale-out power of BigQuery behind the scenes, they have zero impact on customers’ BigQuery slots and reservations.

Dataplex data profiling and AutoDQ are powerful new tools that can help organizations improve their data quality and build more accurate and reliable insights and models.

What our customers have to say

Here is what some of our customers say about Dataplex data profiling and AutoDQ:

“At Paramount we have data coming from multiple vendors and data anomalies might occur from time to time with data from different sources and integration channels. We have started incorporating Dataplex AutoDQ and BigQuery ML to address the challenges to detect and get alerted on anomalies in real time. This is not only efficient but it will improve the accuracy of our data.”  – Bhaskara Peta, SVP Data Engineering, Paramount Global

“At Orange, we are always on the cutting edge of innovation and rely on trusted insights to power this innovation. As we move our data and AI workloads to GCP, we have been looking for an elegant, integrated data quality service to provide a seamless experience to our data engineering team. We started using Dataplex AutoDQ at a very early stage and we believe it could become a strong basis in our journey to Data Democracy. We are also excited to continue partnering with Google on building a stronger and innovative roadmap!” – Guillaume Prévost, Lead Tech, Data and AI, Orange

New features

New and exciting additional features since the public preview include: 

Configure and view results in BigQuery UI in addition to Dataplex 

You can now perform data quality and data profiling tasks directly from BigQuery in addition to Dataplex. Data owners can configure their data scans and publish the latest results of their data profile and data quality scans next to the table information in BigQuery. This information can then be viewed by any user with the appropriate authorization, regardless of the project in which the table resides. This makes it easier for users to get started with data quality and data profiling, and it provides a more consistent experience across all of the tools they use to manage their data.

New deployment options

In addition to our rich UI, we also added support for creating, managing, and deploying data quality and data profiling scans using a variety of methods, including:

Terraform: First-class Terraform support for deploying and managing data quality and data profiling resources.

Python and Java client libraries: We provide client libraries for Python and Java that make it easy to interact with Dataplex data profiling and AutoDQ from your code.

CLI: We also have a comprehensive CLI that can be used to create, manage, and deploy scans from the command line.

YAML: When using the CLI or Terraform, you can create and manage your scans using a YAML-based specification.

We have also made Airflow operators available for data quality to allow engineers to build data-quality checks within their data production pipelines. The Airflow operator gives data engineers more flexibility in using AutoDQ, making it easier to integrate data quality checks with their existing data pipelines.

New configuration options to save costs and/or protect sensitive data

We have enhanced the core capabilities of Dataplex data profiling and AutoDQ to make them more flexible and scalable.

Row filters: Users can now specify row filters to focus on data from certain segments or to eliminate certain data from the scan. This can be useful for tuning scans for specific use cases, such as compliance or privacy.

Column filters: You can now specify column filters to avoid publishing stats on sensitive columns. This can help to protect sensitive data from unauthorized access.

Sampling: You can now sample your data for quick tests to save costs. This can be useful for getting a quick overview of the data quality and data profile without having to scan the entire dataset.

Build your reports or downstream ML models 

Dataplex data profiling and AutoDQ can also export metrics to a BigQuery table. This makes it easy to build downstream applications that use the metrics, such as:

Drift detection: You can use BigQuery ML (BQML) to build a model that predicts the expected values for the metrics. You can then use this model to detect any changes in the metrics that indicate data drift (see the sketch after this list).

Data-quality dashboard: You can build a dashboard that visualizes the metrics for a data domain. This can help you to identify any data quality issues.
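As a sketch of the drift-detection idea (the exported metrics table and its columns below are assumptions; adapt them to whatever schema your scans actually export), you could train a time-series model on one profiling metric with BigQuery ML and then flag anomalies:

-- Train a time-series model on the null ratio of one column, using metrics
-- previously exported by the profiling scan to a BigQuery table.
CREATE OR REPLACE MODEL `my_project.dq.null_ratio_model`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'scan_time',
  time_series_data_col = 'null_ratio'
) AS
SELECT scan_time, null_ratio
FROM `my_project.dq.exported_profile_metrics`
WHERE column_name = 'customer_id';

-- Surface scans whose observed null ratio deviates from the forecast.
SELECT *
FROM ML.DETECT_ANOMALIES(
  MODEL `my_project.dq.null_ratio_model`,
  STRUCT(0.95 AS anomaly_prob_threshold)
);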

We are grateful to our customers for partnering with us to build data trust, and we are excited to make these features generally available so that even more customers can benefit.

Learn more 

Get started by creating a data profile or data quality scan on BigQuery public data 

Learn more about Dataplex Data profiling

Learn more about Dataplex AutoDQ

Git repo with sample scripts and Airflow DAG

Source: Data Analytics

Enhancing Google Cloud’s blockchain data offering with 11 new chains in BigQuery

Early in 2018, Google Cloud worked with the community to democratize blockchain data via our BigQuery public datasets; in 2019, we expanded with six more datasets. Today, we’ve added eleven more of the most in-demand blockchains to the BigQuery public datasets, in preview. And we’re making improvements to existing datasets in the program, too.

We’re doing this because blockchain foundations, Web3 analytics firms, partners, developers, and customers tell us they want a more comprehensive view across the crypto landscape, and to be able to query more chains. They want to answer complex questions and verify claims such as “How many NFTs were minted today across three specific chains?”, “How do transaction fees compare across chains?”, and “How many active wallets are on the top EVM chains?”

Having a more robust list of chains accessible via BigQuery and new ways to access data will help the Web3 community better answer these questions and others, without the overhead of operating nodes or maintaining an indexer. Customers can now query full on-chain transaction history off-chain to understand the flow of assets from one wallet to another, which tokens are most popular, and how users are interacting with smart contracts. 

Chain expansion

Here are the 11 in-demand chains we’re adding into the BigQuery public datasets:

Avalanche

Arbitrum

Cronos

Ethereum (Görli)

Fantom (Opera) 

Near 

Optimism

Polkadot

Polygon Mainnet 

Polygon Mumbai 

Tron

We’re also improving the current Bitcoin BigQuery dataset by adding Satoshis (sats) / Ordinals to the open-source blockchain-ETL datasets for developers to query. Ordinals, in their simplest state, are a numbering scheme for sats. 

Google Cloud managed datasets 

We want to provide users with a range of data options. In addition to community-managed datasets on BigQuery, we are creating first-party Google Cloud managed datasets that offer additional feature capabilities. For example, in addition to the existing Ethereum community dataset (crypto_ethereum), we created a Google Cloud managed Ethereum dataset (goog_ethereum_mainnet.us), which offers a full representation of the data model native to Ethereum with curated tables for events. Customers looking for richer analysis on Ethereum will be able to access derived data to easily query wallet balances, transactions related to specific tokens (ERC20, ERC721, ERC1155), or interactions with smart contracts.

We want to provide fast and reliable enterprise-grade results for our customers and the Web3 community. Here’s an example of a query against the goog_ethereum_mainnet.us dataset:

Let’s say we want to know “How many ETH transactions are executed daily (last 7 days)?”

-- Google Cloud managed dataset
SELECT DATE(block_timestamp) AS date, COUNT(*) AS txns
FROM `bigquery-public-data.goog_blockchain_ethereum_mainnet_us.transactions`
WHERE DATE(block_timestamp) > DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY 1
ORDER BY 1 DESC;

-- Community dataset
SELECT DATE(block_timestamp) AS date, COUNT(*) AS txns
FROM `bigquery-public-data.crypto_ethereum.transactions`
WHERE DATE(block_timestamp) > DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY 1
ORDER BY 1 DESC;

In the results above, you can see that the query against the goog_ dataset is faster and consumes less slot time, while remaining competitive in terms of bytes processed.

More precise data 

We gathered feedback from customers and developers to understand pain points from the community, and heard loud and clear that features such as numerical precision are important for accurately calculating the pricing of certain coins. We are improving the precision of the blockchain datasets by launching UDFs for better UINT256 integration and BIGNUMERIC¹ support. This gives customers access to more decimal digits for their blockchain data and reduces rounding errors in computation.
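As a small illustration of why this matters (column names are as in the community Ethereum dataset, and the conversion assumes a standard 18-decimal ERC-20 token), raw token amounts can be cast to BIGNUMERIC before converting, instead of relying on FLOAT64:

-- value holds the raw ERC-20 transfer amount; casting to BIGNUMERIC keeps
-- far more precision than FLOAT64 when converting to whole-token units.
SELECT
  from_address,
  to_address,
  SAFE_CAST(value AS BIGNUMERIC) / CAST('1000000000000000000' AS BIGNUMERIC) AS amount_tokens
FROM `bigquery-public-data.crypto_ethereum.token_transfers`
LIMIT 10;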

Making on-chain data more accessible off-chain

Today, customers interested in blockchain data must first get access to the right nodes, then develop and maintain an indexer that transforms the data into a queryable data model. They then repeat this process for every protocol they’re interested in. 

By leveraging our deep expertise in scalable data processing, we’re making on-chain data accessible off-chain for easier consumption and composability, enabling developers to access blockchain data without nodes. This means that customers can access blockchain data as easily as they would their own data. By joining chain data with application data, customers can get a complete picture of their users and their business.
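As a simple illustration (the first-party table and its columns here are hypothetical), on-chain transfers can be joined directly against your own customer data in BigQuery:

-- Hypothetical first-party table mapping application users to their wallets.
SELECT
  u.user_id,
  COUNT(*) AS transfers_received
FROM `my_project.app.user_wallets` AS u
JOIN `bigquery-public-data.crypto_ethereum.token_transfers` AS t
  ON LOWER(t.to_address) = LOWER(u.wallet_address)
WHERE DATE(t.block_timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY u.user_id;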

Lastly, we have seen this data used in other end-user applications such as Looker and Google Sheets.

Building together 

For the past five years, we have supported the community through our public blockchain dataset offering, and we will continue to build on these efforts with a range of data options and user choice — from community-owned to Google managed high-quality alternatives and real-time data. We’re excited to work with partners who want to distribute public data for developers or monetize datasets for curated feeds and insights. We’re also here to partner with startups and data providers who want to build cloud-native distribution and syndication channels unique to Web3.

You can get started with our free tier and usage-based pricing. To gain early access to the new chains available on BigQuery, contact web3-trustedtester-data@google.com

1.  32 logical bytes

Source: Data Analytics

Data Security Unveiled: Protecting Your Information in a Connected World

The world is deeply interconnected because of technology, but that connectivity also comes with downsides, such as data breaches that compromise your data. That is why the importance of data security cannot be overstated. But how exactly do you protect your data from cyberattacks?

Want to go beyond understanding data security? This article will provide you with more information about it. Keep reading below.

Defining Data Security

It’s common knowledge that many websites require some type of identification in order to purchase items or gain access to membership areas. By doing so, especially on a site without solid data security measures in place, you leave yourself vulnerable to theft and other assaults. But what does data security entail?

Data security means protecting information against tampering or disclosure by unauthorized parties. It’s crucial because it prevents your sensitive information, including bank records and personal details, from falling into the wrong hands. Sensitive data can be safeguarded in a number of ways, including encryption, login credentials, and firewalls.

In addition, it is crucial to protect private information while still allowing for easy access. Too much security could make it impossible to access stored data at a later date. The information could be lost, stolen, or tampered with if it is not sufficiently protected.

Most Common Ways Data Security Is Breached

Organizations today worry a lot about keeping their data safe. As cloud computing and other disruptive technologies gain traction, it is more crucial than ever to adopt stringent measures to safeguard private information. The widespread impact of daily data breaches, however, cannot be ignored.

The following methods account for the vast majority of known data breaches:

Cloud Data Breaches

Many hackers are clever and talented, and they can access your cloud storage if the data there isn’t protected by adequate security measures. This can result in your sensitive information falling into the wrong hands, and it is a real possibility if you choose to store your valuable information in online clouds.

Although clouds are the new way of keeping backups, they also have a downside: attackers who are skilled at data breaches can penetrate them. Even if cloud storage is cost-effective, your data can still be compromised by online malware attacks, so it may not be worth the risk.

Loss of Device

If a device is stolen or lost, it may end up in the wrong hands, resulting in leaked sensitive personal information. These devices include smartphones, laptops, and tablets, especially when they’re not encrypted. The risk only increases when these devices leave an office and could be exposed to malware on public Wi-Fi networks or in transit between offices.

Malware Attacks

Hackers frequently use malware attacks to break into networks and steal private information. Phishing emails and seemingly innocuous links on websites that really house malware are common entry points for hackers. Understanding the characteristics of phishing emails will help you avoid falling for them and allowing hackers access to your system or network.

Engineered Attacks

Hackers have increasingly used social engineering to gain access to business networks in recent years. Phishing emails often contain or link to fraudulent websites intended to compromise systems and expose networks to additional assaults.

A related tactic is the telephone version of phishing: scams carried out over a call rather than by electronic communication. People are easy prey because they rarely verify the identity of the person on the other end of the line before revealing sensitive information.

Cyberattacks

If malicious software fails to penetrate your system, hackers may resort to brute force. Attackers will use any means possible to break in, which may include trying large numbers of credential combinations or using other applications to reach whatever they need.

Tips on Achieving Strong Data Security

Your data is always at risk, no matter where you are. Your data is always in demand, whether you are a website subscriber or a social media user. You need to take precautions to ensure that your private data will not be stolen and used inappropriately.

To protect your data, you can follow the tips below:

Use a Strong Password

When creating your passwords, it is vital to use a unique combination that incorporates symbols or numbers so that it is more secure. It should also be long enough, but make sure you can still remember it. Avoid using the exact same password across different platforms; if you use the same password for many accounts, hackers only need to crack one of them to access all of them.

Change Passwords Often

Most people forget the importance of changing their passwords regularly. You may not think about it, but it is an important step in ensuring your data security. In most cases, if you change your password often, hackers won’t get a chance to steal your information because it will be harder for them to penetrate your account.

Additionally, by making this a habit, you can rest easy knowing that your information is secure. Also, avoid giving your information to unreliable websites, because doing so can put it at risk.

Update Software Regularly

By making sure that your software is up to date, you remove vulnerable spots that hackers could use to access your valuable information. See to it that you update your software to the latest versions so that the risk is reduced. With regular updates, you maintain a strong firewall as well as added protection from cyber attackers.

Is Dependable Data Security Mandatory?

Data may now be accessed from anywhere with an internet connection, but once it has been stolen, there is no way to get it back. This is particularly true for sensitive or secret information. Protecting your data from intrusion and other undesirable issues requires solid data security. If your sensitive data is compromised, there could be serious problems. Since your data contains potentially damaging information, protecting it rigorously should be a key priority.

Conclusion

You should do everything in your power to keep your sensitive information secure. The risk of someone gaining access to your data and using it unlawfully can be minimized by being selective about the sites and services you provide it to. In addition, make it a point to use strong, unique passwords wherever possible to protect yourself against intrusion. Finally, don’t get tricked into clicking suspicious emails or websites, so you can avoid malware.

Source: SmartData Collective

4 Ways AI Can Improve Your Marketing Strategy

AI technology is rapidly changing the state of business. Last year, around 35% of businesses reported using AI to some degree. This figure is going to rise sharply as more companies discover the benefits of using various AI tools such as ChatGPT.

AI seems to be popping up in all sorts of places, including in marketers’ toolkits. We talked extensively about some of the benefits of using AI in marketing before. While a lot of the discussions are centered around the hype of AI in marketing and some of its applications may be overblown, it is also going to be a gamechanger for the future. Strategy is no longer something dreamed up by humans in a brainstorming room on a Tuesday afternoon. AI is getting into the game when it comes to ideation and implementation.

Ways that Marketers Are Using AI

For marketers worried about the profession’s future, know AI can only be a supercharged assistant at this stage. And you want this technology by your side because it can help you improve your promotional strategies in four ways. Check out how AI enhances marketing approaches below.

1. Create More Relevant Content

As we mentioned in the past, there are a ton of ways that AI can help with content creation. We even listed a number of AI tools that can help with creating great content, such as ChatGPT.

Part of an inbound marketing strategy is to draw leads to your website. Once they’re there, you want to keep them engaged enough to stay. Your online content, whether it’s a blog post or a landing page, is responsible for driving engagement levels. The content must be relevant to your audience’s interests, pain points, and search needs to do this.

A high bounce rate can indicate your website’s content isn’t targeting your audience as it should. Bounce rates are the percentage of people who land on a single page of digital content and leave. They don’t stick around to browse other pages or convert by taking a desired action, such as buying a product.

According to Hawke, average bounce rates increased from 47% to 56% in 2023. This means brands aren’t captivating and targeting audiences with content as relevant as before. AI programs can help address this issue by refining keyword research and discovering target-market insights. Instead of guessing what your market wants to read about and see online, AI can analyze all your data collectively, including sources such as your customer service calls.

2. Streamline Everyday Tasks

Do you have a social media manager who’s overwhelmed by having to constantly post to platforms like Facebook and Instagram? This aspect of their job may take up too much of their time. If you’re a smaller company with a one-person social team, everything they’re responsible for can turn into unrealistic targets.

Today’s social media managers aren’t just overseeing digital content. They manage online customer service, digital marketing strategy, and social listening. The growth of social media is one of the reasons why social media manager roles are expected to expand, with growth of at least 10% projected from 2020 to 2030.

With increased staffing needs, you don’t want your team to be unable to effectively juggle competing tasks. AI can automate the most repetitive of them, including scheduled posts. Other possibilities include streamlining lead-gen campaigns, competitor research, and routine responses to customers’ questions. Your social media manager will have additional time to devote to and perfect digital content strategies.

3. Make Market Research a Breeze

Not doing your homework and expecting to ace the test is usually futile. There’s a high chance you won’t get the results you want. It’s like this with marketing strategies, too. You want to do your homework (aka market research) before you design and execute anything.

Why? Because even if you think you know your audience like the back of your hand, you probably don’t. There’s always something new to discover, and consumer preferences can change. Launching a different set of products, rebranding, and entering unfamiliar markets are actions companies routinely take. You can’t assume two-year-old research is going to do the trick.

However, gathering and analyzing all the data can become a cumbersome project for any team. You’re putting in months of work after you design surveys and collect third-party sources. Your data could become outdated or miss the mark a bit. Thankfully, AI tools can speed up the process of collecting up-to-date information.

AI will show what your clients ask about from conversations with your organization. These tools analyze your existing first-party data so you don’t waste time duplicating it. Your research will also be more accurate, allowing you to optimize your strategies. 

4. Predict Emerging Trends

AI isn’t exactly a crystal ball. Still, the technology does take large amounts of customer data and make predictions based on patterns. Known as predictive analytics, it’s one of AI’s most powerful capabilities. As this article from LinkedIn points out, AI can create a powerful advantage for companies trying to predict future trends. Think of it as the equivalent of human wisdom gained through experience. You can predict that x is likely to happen in a specific scenario because you’ve seen it play out before.

A brand like Coca-Cola might use AI’s predictive capabilities to introduce new flavors. A prepaid wireless company like Mint Mobile might see trends in consumers’ concerns about price. The carrier could launch a plan with a lower price as a test. If enough customers switch, AI’s predictive analytics prove correct.

Beyond products, companies can use AI’s predictions to further segment customers. If you can identify who’s most likely to remain brand loyal, you can target them with effective messaging. Automated mobile app notifications based on past purchases and browsing behaviors are examples.

Maybe the data shows a boost in the number of customers who swing by the coffee shop every Friday. AI can isolate who’s stopping by so you can target them with messages about an ongoing BOGO Fridays promo. The trends revealed by predictive analytics have the power to spark ideas about drip campaigns, new promotions, and strategic overhauls.

Refine Your Marketing Strategies With AI      

A marketing strategy is only as good as the information behind it. When plans fall short of performance goals, something usually goes awry with the data. Typically, you can point fingers at problems with research methodologies and analysis.

Add in overtaxed marketing teams, and you have another layer of complications. By taking advantage of AI’s capabilities, your company can improve its approaches to targeting and engaging its audiences. The results will speak for themselves.

Source: SmartData Collective

IoT And Cloud Integration is the Future!

In the last decade, we’ve observed an immense surge in technological advancements. From smart homes to wearables, cars to refrigerators, the Internet of Things (IoT) has successfully penetrated every facet of our lives. But have you ever wondered what makes all this possible? What powers these devices to become more intelligent, responsive, and adaptable? The answer: Cloud Computing.

The market for the Internet of Things (IoT) has exploded in recent years. Precedence Research projects that it will be worth over $1.5 trillion by 2032.

Cloud computing offers unparalleled resources, scalability, and flexibility, making it the backbone of the IoT revolution. This synergy between IoT and cloud platforms is transforming industries, innovating solutions, and simplifying processes. Let’s explore how IoT and Cloud Integration work hand in hand to power the future:

Of course, there are a lot of risks associated with cloud computing. However, there are far more benefits that cannot be overlooked.

1. Endless Scalability: Cloud’s Biggest Offering To IoT

IoT devices produce an enormous amount of data every second. Processing, storing, and analyzing this data is a monumental task. This is where cloud platforms come to the rescue. 

Cloud platforms provide virtually limitless storage and processing capabilities, ensuring that as the number of connected devices grows, the infrastructure can adapt and handle it. The seamless scalability offered by the cloud is indeed its most significant contribution to the IoT ecosystem. As cloud experts often say, the cloud’s capacity to scale is the secret sauce of IoT’s global reach and success.

2. Real-Time Processing And Analysis

One of the critical aspects of IoT is its need for real-time data processing and analytics. Whether it’s a health monitor alerting a user of abnormal heart rates or a smart home system adjusting room temperatures, the reaction time is paramount. 

Cloud platforms are built to handle real-time data analytics, processing information as it comes in and providing immediate insights. This instant feedback loop is crucial for IoT devices to function optimally and improve user experiences.

3. Cost-Effective Storage Solutions

With billions of devices producing data, the storage demands are bound to be colossal. Traditional storage solutions would be not only inefficient but also exorbitantly expensive. 

Cloud computing provides a cost-effective alternative. Instead of maintaining physical servers, organizations can rent storage space in the cloud, only paying for what they use. As data storage needs increase or decrease, companies can easily adjust their storage plans, ensuring both cost-efficiency and flexibility.

4. Enhanced Security And Compliance

Security is a significant concern when it comes to IoT. The vast number of interconnected devices creates multiple entry points for potential cyber threats. 

However, cloud platforms are continually investing in advanced security measures. From encryption to firewalls, identity and access management, and regular compliance checks, cloud platforms offer a robust security infrastructure. It ensures that data transmitted from IoT devices is well-protected against breaches and cyberattacks.

5. Seamless Integration And Interoperability

As the IoT ecosystem expands, devices from different manufacturers need to communicate effectively. Cloud platforms provide tools and frameworks that ensure seamless integration and interoperability. This means that a wearable from Company A can effectively communicate with a smart home system from Company B, making the IoT ecosystem truly interconnected and cohesive.

6. Remote Access And Control

One of the core advantages of the IoT is the ability to access and control devices remotely. Cloud platforms make this possible by hosting applications and interfaces that can be accessed from anywhere, any time. 

Whether it’s adjusting your home’s thermostat while you’re on vacation or a manufacturer monitoring equipment in a remote factory, cloud platforms enable 24/7 access, ensuring control and monitoring in real-time.

7. Future-Proofing With Updates And Patches

IoT devices require regular software updates to improve functionalities and fix vulnerabilities. Distributing these updates can be challenging, especially with devices scattered globally. Cloud platforms offer a centralized system where updates and patches can be rolled out universally, ensuring that devices stay up-to-date and secure.

Conclusion

There are so many gamechanging new technologies that have shaped our world in recent years. One of the biggest has been the Internet of Things. In fact, we have gone on to say that the IoT is probably the most impactful new technology of the 21st Century.

The confluence of IoT and Cloud Computing is undeniably ushering in a new era of technological innovation. As more devices become interconnected, the reliance on the cloud will only grow. With its scalability, real-time processing capabilities, cost-effectiveness, and advanced security measures, the cloud is perfectly poised to power the next phase of the IoT revolution.

Source: SmartData Collective

7 Major IT Infrastructure Challenges for Data-Driven Companies

Big data technology has been a huge gamechanger for countless companies in every sector. Around 60% of companies rely heavily on data analytics technology to meet their goals.

However, despite the many benefits of big data technology, many companies still have difficulty implementing it properly. Only 13% of companies that have instituted data strategies are delivering on them.

There are a number of reasons that companies have difficulty meeting their objectives with big data. One of the biggest issues is that they don’t manage their IT infrastructure properly. This can lead to a number of problems, such as data storage issues.

IT Infrastructure Issues Are a Major Source of Problems for Data-Driven Companies

Your business relies heavily on its information technology, and its IT lives and dies by the maintenance of its infrastructure. If you and your IT team improperly manage your business’s IT infrastructure, you will suffer unbearable amounts of downtime that will cut deep wounds in your budget and potentially cause your business to fail.

The first step to setting your IT infrastructure up for success is to understand some of the most significant challenges affecting infrastructure management that can impact your data strategy. You can read about those here, so you can make important changes that keep your business alive and well:

Network Access

It should go without saying that high-speed internet is essential for smooth operations — and yet, many companies continue to struggle with weak internet connections for a variety of reasons. In any environment, it is possible to achieve fast internet speeds, and an experienced team of IT professionals can help your business achieve and maintain a fast and efficient connection to the internet.

Data Management

Many businesses run on TCP/IP networks, which utilize routers and switches to protect emails, apps, internet browsing and more. Unfortunately, these components can radically reduce network speeds, negatively impacting workflows and making online communication difficult. Worse, when poorly managed, these networks can lose packets of information, and substantial data loss is dangerous to an organization in dozens of ways. A secure IT infrastructure needs to upgrade its networks to the latest high-speed, high-security options.

Workforce Retention

It may sound facile, but your workforce is the most valuable resource you have for maintaining a secure and efficient IT infrastructure. Without knowledgeable and skilled IT staff who have a deep and thorough understanding of your infrastructure, your infrastructure is likely to suffer in various ways. Therefore, you must make it a priority to attract top IT talent and retain it. You may need to reconsider how you approach IT recruitment and create systems for rewarding and empowering existing staff to ensure they feel engaged and excited to contribute to your company into the future.

Hardware Age

Computers and other devices have estimated lifetimes of between three and five years. This is not necessarily because devices break down so severely after this relatively short period of time but more often because the new hardware available on the market is so much more advanced and offers compelling advantages like speed and power that can enhance productivity. However, replacing an entire organization’s hardware every few years is a monumental endeavor that is expensive and time-consuming. It might be more efficient for IT to focus on repairing hardware as necessary and only replacing hardware that is supremely out of date.

Workforce Mobility

Even before the pandemic, many businesses were moving toward increased mobility of their workforce, with more and more mobile devices added to business networks. Today, in addition to workforce mobility, more than 16 percent of companies have to contend with a fully remote workforce, which offers a tremendous challenge to IT staff tasked with maintaining integrity and cohesion in an infrastructure. IT needs to be able to develop strong policies for the types of devices and applications permitted on the network and carefully controlled strategies for securing all manner of devices and data.

Cloud Computing

Almost every business utilizes the cloud to some degree, but rarely is cloud usage straightforward. More often, cloud computing involves a diverse environment of public and private clouds as well as on-site data storage to accommodate all business needs. IT teams should work with business leaders to manage their existing cloud systems effectively, which may involve finding solutions for consolidating and simplifying the cloud to enhance its security.

Capital Investment

You must strike a careful balance in how much capital you invest into your IT infrastructure. Too much funding could overwhelm your IT team and overcomplicate your IT infrastructure, resulting in more technology than your business needs. Then again, not enough capital will force your IT team to cut corners that will inevitably result in inefficiencies and insecurities that will hamper your business growth. You need to ensure that you have IT leaders that you trust, and you may want to welcome IT into upper management with positions in the c-suite such as CTO, CIO, CSO and CISO.

As tempting as it might be to imagine IT infrastructure as a one-and-done project — something your IT staff can construct and forget — the truth is that infrastructure requires careful and continuous maintenance. By paying attention to some of the most pressing IT infrastructure problems, you can keep your business’s technology up and running for years to come.

Data-Driven Companies Need to Make it a Priority to Manage their IT Infrastructure

There are a number of challenges that you need to contend with when you are trying to run a data-driven organization. You will need to implement the right IT management practices to handle data effectively. Fortunately, you will have an easier time if you are willing to follow the guidelines listed above.

Source: SmartData Collective

Green Data Centers Make Data-Driven Entities More Sustainable

The proliferation of big data has had a huge impact on modern businesses. We have a post on some of the industries that have been most affected by big data. However, it has also created some concerns about sustainability.

Of course, there are some reasons big data can help make our communities more sustainable. On the other hand, data centers have large carbon footprints, so companies need to find ways to make them more energy efficient.

The Importance of Green Data Centers for Data-Driven Businesses

By the year 2023, the global data center colocation market will reach $131.80 billion – but as powerful for resource management and bolstering security as DC facilities are, they’re also very harmful to our environment. Researchers estimate that in 2025, they will consume around 20% of the world’s power supply. To stop – or at least minimize – their negative impact on planet Earth, more and more organizations steer toward greener data center solutions. What makes them different from traditional data centers? And what are the benefits of investing in sustainable DC facilities? Let’s embark on a journey to understand this pivotal transformation.

The digital revolution is nowhere near slowing down, with data centers spearheading the rapid advancements in IT resources. While this growth trajectory is met with enthusiasm in many circles, it also raises environmental concerns. And those are not without cause, as data centers already account for 4% of global energy consumption and 1% of global greenhouse gas emissions. While these numbers may not seem high at first, they may become troubling when we think of how quickly data centers are growing – and according to the Globe News Wire, their global colocation market will expand with a compound annual growth rate of 6.5% from 2021 to 2027. That’s why it’s so crucial to look into solutions that help counteract the rising threat that DCs pose to our planet’s environmental stability.

Leading Initiatives in Data Center Sustainability

The transition to a green data center is more than just a trend; it’s a necessity acknowledged and embraced by industry forerunners such as Meta and Microsoft. Here, we highlight some of the most important data center operators’ initiatives:

Meta: In 2022, Meta showcased data centers with an impressive Power Usage Effectiveness (PUE) of 1.09 and a Water Usage Effectiveness (WUE) of 0.20. These figures represent a significant stride in optimizing energy and water utilization in data center operations.

Microsoft: Setting a benchmark in renewable energy utilization, Microsoft has projected that by 2025, its data centers in Ireland will be fully powered by renewable energy sources. This initiative is backed by new projects supported by Power Purchase Agreements (PPAs).

Global Investment: A study conducted by Arizton Advisory & Intelligence predicts that the worldwide investment in the green data center sector will escalate to an estimated $35.58 billion by 2027.

These are just a few initiatives taken on by the industry leaders, which indicate a trend that will probably engulf the sector in the near future. As more companies join this green revolution, the emphasis on sustainability within data center operations is definitely going to amplify.

What Makes a Data Center Sustainable?

With green data centers gaining momentum, some may ask what this trend is even about. Below, we’re listing the top factors that contribute to data center sustainability, as well as exploring the complex interplay of technological advancements, government policies, and the existing challenges and solutions involved in this transformation.

Advanced Cooling Techniques
Sustainable data centers are increasingly adopting liquid cooling systems and heat reuse solutions that are not only energy-efficient but also significantly reduce the carbon footprint.

Energy-Efficient Hardware
The heart of these centers lies in their utilization of energy-efficient components. This involves harnessing processors and storage devices that offer superior performance while consuming a fraction of the energy utilized by conventional counterparts. Energy efficiency is one of the top conditions a data center must meet in order to be recognized as sustainable.

Renewable Energy Integration
A monumental shift can be seen in the power sources of these data centers, with considerable emphasis on tapping into renewable energy avenues such as wind, solar, and geothermal energy, thereby diminishing their reliance on fossil fuels. These initiatives help maximize energy efficiency and minimize the carbon footprint simultaneously.

Waste Management & Recycling
Embarking on responsible waste management practices is an intrinsic aspect, which encompasses diligent recycling and disposal procedures to prevent environmental degradation and foster a circular economy.

Intelligent Monitoring Systems
Implementing intelligent systems that offer real-time data on various environmental factors, assisting in pinpointing inefficiencies and creating avenues for improvements, thus setting a benchmark in environmental responsibility within the tech industry.

Advantages of Green Data Center Integration

Green data centers are a great way to reduce energy consumption and lessen the environmental impact of traditional DC facilities. But what about business? Are there any tangible upsides to implementing sustainable data center solutions in and for your company? Here are some of the rewards you can reap by integrating them into your business strategy:

1. Gaining a competitive advantage

Green data centers are the bedrock of cutting-edge technology. The integration of state-of-the-art IT infrastructure guarantees not only higher reliability but also significantly improved uptime. That’s what poises your company as a leader in innovations and can help you attract new potential clients.

2. Optimized performance

Leveraging an eco-friendly data center is a surefire way to boost IT operations performance and minimize disruptions or downtime. The utilization of features like uninterruptible power supply fosters operational continuity in your business, which aids employee productivity and customer satisfaction.

3. A financial booster

Embracing green technology can lead to substantial long-term savings. Companies are finding that by minimizing resource wastage and optimizing cooling systems, they can significantly reduce operational expenses related to electricity and other resources. What’s more, various global initiatives offer incentives to encourage organizations worldwide to adopt green technology practices. Here are a few that mastered it:

US Federal Tax Incentives: Offers deductions for the incorporation of energy-efficient systems in business infrastructures, encouraging a shift towards greener technology.

Green Mark Scheme (Singapore): Recognizes and rewards companies for reducing environmental impacts, fostering a sustainable business ecosystem in Singapore.

Horizon 2020 (EU): Provides substantial funding for environmental R&D projects, promoting innovation and global competitiveness in the European region.

4. Adaptable Infrastructure

One of the best features of green data centers from the business POV is their modular designs and flexible infrastructure. This adaptive approach enables businesses to address unique needs and requirements better, offering a competitive edge in the dynamic digital landscape.

5. Enhanced Reputation

In the contemporary market, sustainability is more than a trend; it’s a demand. Companies making a shift towards green initiatives not only witness an improved brand reputation but also attract customers who prioritize environmental responsibility. After all, going green will always attract some welcome attention. Need proof? Just take a look at Apple, a company with a significant commitment to renewable energy, which has seen a surge in brand loyalty and customer approval, showcasing that sustainability and business success can – and do – go hand in hand. 

As the journey towards a greener future accelerates, understanding and leveraging the multifaceted benefits of a more eco-friendly data center can be a game-changer for businesses looking to thrive in an increasingly environmentally conscious market.

The Future Ahead: Innovations and Insights into Sustainable Data Centers

Sustainability is definitely THE trend to look out for in the future of data centers. However, although this approach brings many environmental and business benefits, it also poses many challenges. What direction will green data centers take? Here are a few perspectives worth considering when looking into the future of the data center industry.

Leveraging Artificial Intelligence and Machine Learning

Green data centers are increasingly incorporating Artificial Intelligence (AI) and Machine Learning (ML) to enhance efficiency and minimize energy consumption. Companies like Google have already set a benchmark by utilizing AI to reduce their data center’s energy usage by 15%, showcasing the potential of intelligent technologies in optimizing data center operations.

Geo-Distributed Data Centers

The advent of geo-distributed data centers is revolutionizing the way data is managed and stored. These centers not only facilitate reduced latency and improved data accessibility but also allow for the utilization of region-specific sustainable resources, creating a harmonious balance between technology and the environment.

Policy Influences and Government Initiatives

As governments globally tighten regulations surrounding environmental conservation, data centers are urged to align with these guidelines. For instance, the European Commission’s Green Digital Program aims to render the ICT sector climate-neutral by 2030, urging data centers to adopt more sustainable practices and thereby shaping the future trends in the sector.

Renewable Energy Integration

With a sharpened focus on reducing carbon footprints, many data centers are gradually transitioning to renewable energy sources. Microsoft, for instance, has committed to becoming carbon-negative by 2030, with plans to draw 60% of its data center energy from renewable resources, setting a precedent for others in the industry.

Collaborative Innovations

Collaborative efforts between industry giants are fostering innovation in the sector. Recent partnerships like that between Equinix and Singapore’s national research agency A*STAR aim to develop green data center technologies, signaling the promising future of collaborative research and development in this sector.

Summary

As we venture further into this era of green technology, an eco data center stands as a pivotal entity, continuously evolving to meet the environmental and technological demands of the future. It’s a journey of innovation, collaboration, and commitment towards fostering a greener, more sustainable world.

Prepared by: Data Center Team at Comarch

Source: SmartData Collective

The Role of Data in Automating Healthcare Processes for Improved Patient Results

The Role of Data in Automating Healthcare Processes for Improved Patient Results

In recent years, data has become a more important aspect of business operations across a range of sectors. One space that is benefiting significantly from utilizing data is healthcare. In fact, healthcare organizations across the world are improving their operations as a result of using data to glean important insights.

While data has brought healthcare many benefits, one of the most important is improved patient results. Understanding how automation benefits patients is vital for appreciating the immense impact data is currently having on healthcare.

Here is the role of data in automating healthcare processes for improved patient results.

Understanding Robotic Process Automation

Robotic process automation is improving healthcare in a variety of exciting ways with the help of data. Essentially, robotic process automation is an approach in which software performs repetitive tasks automatically, without human intervention.

In healthcare, this technology is being used to automate a variety of routine processes. While this may seem insignificant at first glance, the efficiency it adds helps healthcare organizations become more productive. As a result, patients receive better outcomes because of the improved care they receive.

One incredibly important way that robotic process automation is being used is in performing administrative tasks. This includes tasks such as scheduling appointments for patients and sending out billing information. To perform these tasks, data is quickly analyzed and used to glean pertinent information. As a result, healthcare workers can spend more time and energy caring for patients than engaging in these menial tasks.
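
As a rough illustration (not any particular vendor’s product), the sketch below shows the kind of rule-driven job an RPA bot might run: it reads hypothetical appointment records and produces reminder and billing messages without human intervention. The record fields and message formats are invented for this example.

```python
# Minimal sketch of an RPA-style job: read appointment records and
# generate reminder and billing messages without human intervention.
# The record fields and message formats are hypothetical examples.
from datetime import date, timedelta

appointments = [
    {"patient": "A. Smith", "date": date.today() + timedelta(days=1), "balance_due": 120.00},
    {"patient": "B. Jones", "date": date.today() + timedelta(days=10), "balance_due": 0.00},
]

def build_tasks(records, reminder_window_days=3):
    """Return the reminder and billing messages a bot would send out."""
    tasks = []
    for rec in records:
        if (rec["date"] - date.today()).days <= reminder_window_days:
            tasks.append(f"REMINDER to {rec['patient']}: appointment on {rec['date']:%Y-%m-%d}")
        if rec["balance_due"] > 0:
            tasks.append(f"BILLING notice to {rec['patient']}: ${rec['balance_due']:.2f} outstanding")
    return tasks

for message in build_tasks(appointments):
    print(message)
```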

Another incredibly significant way that healthcare organizations are using robotic process automation is to coordinate treatments for patients. This is immensely important for patients who need care from a variety of specialists to achieve optimal health outcomes. Now, instead of needing an employee to engage in these tasks, robotic process automation systems utilize health data to coordinate treatment automatically.

By taking some of the workload off of staff, healthcare facilities can save money by taking advantage of robotic process automation. In this way, these healthcare providers can even help patients reduce healthcare costs without having to sacrifice the standard of care they receive.

This is one of the many ways healthcare companies are now using data to improve their offerings. As a result, patients are now receiving improved care and health outcomes thanks to the utilization of data in specific and meaningful ways.

Data-Informed Healthcare Decisions

Today, patients rely on healthcare professionals’ expertise and specialized knowledge to decide the best course of their care. While these professionals are typically incredibly talented, human error can still sometimes occur, resulting in poorer health outcomes. Beyond this, crafting a treatment plan and making healthcare decisions for patients can also be a lengthy process, especially if it involves different types of specialists.

Fortunately, data has presented itself as an effective and accessible solution to this healthcare problem. Essentially, software is now capable of rapidly analyzing large health-related data sets. After conducting these analyses, this software can quickly provide healthcare professionals with insights about the best statistical healthcare options that patients have.

This can help streamline the process of making healthcare decisions while also significantly reducing, or completely eradicating, the chance of human error. As such, data is helping patients achieve better health outcomes by providing insights that can inform healthcare and treatment decisions.
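
As a rough sketch of the kind of analysis such decision-support software performs, the example below summarizes recovery rates per treatment from a small set of hypothetical historical records; the data and field names are invented for illustration, and real systems would work over far larger, richer datasets.

```python
# Minimal sketch: summarize historical outcomes per treatment so a
# clinician can see which option has performed best for similar cases.
# The dataset and column names are hypothetical.
from collections import defaultdict

records = [
    {"treatment": "drug_a", "recovered": True},
    {"treatment": "drug_a", "recovered": False},
    {"treatment": "drug_b", "recovered": True},
    {"treatment": "drug_b", "recovered": True},
    {"treatment": "drug_b", "recovered": False},
]

def recovery_rates(rows):
    """Compute the recovery rate per treatment from historical records."""
    totals, successes = defaultdict(int), defaultdict(int)
    for row in rows:
        totals[row["treatment"]] += 1
        successes[row["treatment"]] += int(row["recovered"])
    return {t: successes[t] / totals[t] for t in totals}

for treatment, rate in sorted(recovery_rates(records).items(), key=lambda kv: -kv[1]):
    print(f"{treatment}: {rate:.0%} recovery in past cases")
```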

Robot Nurses

Though it may sound more like fantasy than reality, technology is finally reaching the point at which robots can help treat patients. While this technology is still relatively new and isn’t yet used in every healthcare facility, this will likely change over the next few years.

Robot nurses can perform a wide variety of tasks ranging from cleaning hospital rooms to running diagnostics on patients. These robots are programmed with health data so they can provide patients with particular types of care based on their specific needs. In addition, they can be provided with facility-specific data to perform tasks based on need rather than being programmed to perform specific tasks at specific times.
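
A minimal sketch of that need-driven scheduling idea might look like the following, using a simple priority queue of hypothetical tasks; real robot nurses would of course draw on far richer clinical and facility data.

```python
# Minimal sketch: a robot nurse's task queue ordered by need rather than
# by a fixed timetable. Task names and priority scores are hypothetical.
import heapq

# Lower number = more urgent
tasks = [
    (3, "restock supply cart"),
    (1, "deliver medication to room 12"),
    (2, "run routine vitals check, room 7"),
    (1, "respond to patient call button, room 3"),
]

heap = list(tasks)
heapq.heapify(heap)

while heap:
    priority, task = heapq.heappop(heap)
    print(f"priority {priority}: {task}")
```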

The implications of this type of technology are immense, with robot nurses being able to help both healthcare workers and patients in a variety of ways. One specific way robot nurses can ensure that patients are receiving better care is by decreasing instances of nurse burnout. Essentially, these robots perform many tasks that human nurses currently have to perform, thereby decreasing their workload and giving them more opportunities for respite.

As this technology improves and becomes more robust, it’s likely nurse robots will become a staple in every healthcare facility and possibly even in the homes of patients with more acute healthcare needs. In this way, data is playing a vital role in automating healthcare processes and improving the level of care that patients receive.

Data is Helping Healthcare Become More Automated in Exciting Ways

While there are many exciting technologies now making the healthcare space more robust, those that utilize data are having some of the biggest impacts. From software that helps create perfect treatment plans for patients to robot nurses that help care for patients, data-driven technology is helping healthcare organizations become more efficient and resilient.

Hopefully, new ways of taking advantage of data in the healthcare space will be discovered in the near future and even more patients will benefit from experiencing optimal health outcomes.

Source: SmartData Collective

NIST 800-171 Safeguards Help Non-Federal Networks Handling CUI

NIST 800-171 Safeguards Help Non-Federal Networks Handling CUI

Today, there is a pressing need for non-federal networks to apply effective cybersecurity measures to protect controlled unclassified information (CUI). CUI is sensitive yet unclassified government information, covering matters like military equipment specifications. Although this information is unclassified, unauthorized access can have severe economic and national security implications.

In response to the rise in alarming cyber attacks, the United States National Institute of Standards and Technology released NIST Special Publication 800-171 to safeguard CUI in non-federal organizations and information systems. NIST 800-171 is a noteworthy framework that helps organizations maintain a firm cybersecurity posture. In this article, you will learn how the publication has strengthened data security in non-federal organizations.

Awareness and Training

Awareness and training are critical components of NIST Special Publication 800-171 that guide non-federal organizations working with CUI. This safeguard ensures that personnel in these organizations are educated on information security policies, procedures, and best practices, and that they are well informed about their roles in protecting CUI. They should also be able to recognize threats and respond to incidents.

Access Control

NIST 800-171 ensures that non-federal organizations safeguard CUI by allowing access only to authorized persons in the organization. According to the publication, organizations must manage and restrict access to data, systems, and resources. They apply controls like the following (a minimal code sketch follows the list):

User authentication: Personnel must always use multi-factor authentication to access the information.

User authorization: Even though the personnel can access data, it is only restricted to the information relevant to their roles and responsibilities (the principle of least privilege).

Access permission: The publication requires organizations to set parameters that determine who can view, modify, or delete CUI. Again, access depends on the roles and responsibilities of the staff.
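
Here is a minimal sketch of how the least-privilege and authorization ideas above could be expressed in code. The roles, permissions, and resource labels are hypothetical, and multi-factor authentication itself would be handled by an identity provider rather than application code like this.

```python
# Minimal sketch of role-based, least-privilege access checks for CUI.
# Roles, permissions, and resource labels are hypothetical examples.
ROLE_PERMISSIONS = {
    "network_analyst": {"read:network_logs"},
    "data_auditor": {"read:data_audit_logs", "read:network_logs"},
    "admin": {"read:network_logs", "read:data_audit_logs", "delete:data_audit_logs"},
}

def is_allowed(role: str, action: str, resource: str) -> bool:
    """Grant access only if the role explicitly holds the permission."""
    return f"{action}:{resource}" in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("network_analyst", "read", "network_logs")
assert not is_allowed("network_analyst", "read", "data_audit_logs")   # least privilege
assert not is_allowed("data_auditor", "delete", "data_audit_logs")    # no delete rights
print("access checks passed")
```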

Configuration Management

Configuration management is another safeguard in the NIST 800-171 publication. It guides non-federal networks in establishing and maintaining secure configurations for software and hardware systems. It requires organizations to keep an up-to-date inventory of all authorized devices within every network, including workstations, servers, switches, routers, and other equipment.

Non-federal organizations should also keep an inventory of all the authorized software applications on every network device. That way, the system can automatically detect and remove any unauthorized software.
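
A minimal sketch of such an allowlist check is shown below; the approved package names and device inventory are hypothetical, and a real system would pull them from an asset database rather than hard-coded values.

```python
# Minimal sketch: compare the software reported on a device against an
# approved inventory and flag anything unauthorized. Package names are
# hypothetical; a real system would query an asset database.
APPROVED_SOFTWARE = {"openssh-server 9.3", "nginx 1.24", "postgresql 15"}

def audit_device(hostname: str, installed: set) -> list:
    """Return the packages on a device that are not on the approved list."""
    return sorted(installed - APPROVED_SOFTWARE)

unauthorized = audit_device("web-01", {"nginx 1.24", "openssh-server 9.3", "netcat 1.10"})
if unauthorized:
    print(f"web-01: unauthorized software detected -> {unauthorized}")
```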

Audit and Accountability

The special publication requires that non-federal organizations follow the stipulated guidelines for securing controlled unclassified information. Under this safeguard, strong audit and accountability mechanisms are established to monitor and track security-related activities and events.

These organizations should establish and implement audit policies and configurations for their systems. The policies should specify the events to be audited, the information to be collected, and where audit logs are stored. There should also be audit trails that record security-related activities and events. Non-federal organizations should store audit logs securely so that no one can access, tamper with, or delete them without authorization.
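
As a rough illustration of tamper-evident audit trails, the sketch below chains each audit record to the hash of the previous one so that edits to earlier entries become detectable. The event fields are hypothetical, and production systems would also ship logs to protected, access-controlled storage.

```python
# Minimal sketch: append-only audit records chained with hashes so that
# tampering with an earlier entry is detectable. Event fields are
# hypothetical; real deployments would ship logs to protected storage.
import hashlib
import json
from datetime import datetime, timezone

audit_log = []

def append_audit_event(actor: str, action: str, resource: str) -> None:
    """Append one audit record, linking it to the previous record's hash."""
    prev_hash = audit_log[-1]["hash"] if audit_log else "0" * 64
    record = {
        "time": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "resource": resource,
        "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    audit_log.append(record)

append_audit_event("jdoe", "read", "cui/contract-specs.pdf")
append_audit_event("jdoe", "modify", "cui/contract-specs.pdf")
print(json.dumps(audit_log, indent=2))
```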

Incident Response

The NIST 800-171 publication is keen to ensure that non-federal networks adhere to its incident response guidelines. It prepares organizations to respond effectively to emerging security threats. The publication requires organizations to create a well-organized incident response plan outlining how they should handle security incidents as they arise. The plan should include procedures for detecting, reporting, and responding to security threats.
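
A minimal sketch of a structured incident record following a detect, report, and respond flow might look like this; the severity levels and contact point are hypothetical placeholders, not part of the publication itself.

```python
# Minimal sketch: a structured incident record that follows a
# detect -> report -> respond flow. Severity levels and contact
# points are hypothetical placeholders.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Incident:
    summary: str
    severity: str                      # e.g. "low", "medium", "high"
    detected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "detected"
    actions: list = field(default_factory=list)

    def report(self, contact: str) -> None:
        self.actions.append(f"reported to {contact}")
        self.status = "reported"

    def respond(self, step: str) -> None:
        self.actions.append(step)
        self.status = "responding"

incident = Incident(summary="Unauthorized access attempt on CUI share", severity="high")
incident.report(contact="security-operations@example.org")
incident.respond(step="disable affected account and preserve audit logs")
print(incident)
```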

Constant Monitoring

The NIST 800-171 special publication requires that these networks conduct continuous monitoring to safeguard CUI. Continuous monitoring means maintaining ongoing surveillance of, and regularly assessing, the organization’s security posture in order to identify and respond to possible threats. The process involves risk assessment, where organizations must identify and evaluate potential vulnerabilities and risks to help prioritize monitoring efforts.

Organizations should also constantly monitor security controls to confirm that they remain effective at protecting CUI. This involves monitoring access controls, detecting intrusions, and monitoring encryption. Organizations should gather threat intelligence to stay informed of new vulnerabilities, and they should scan their systems with scanning tools to identify the weaknesses attackers could exploit to penetrate them.
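
As one small example of a continuous-monitoring check, the sketch below counts failed logins per account in a recent window and flags spikes. The log format and threshold are hypothetical, and real monitoring would read live log streams rather than a fixed list.

```python
# Minimal sketch of one continuous-monitoring check: count failed logins
# per account over a recent window and flag spikes. The log format and
# threshold are hypothetical; real monitoring would read live log streams.
from collections import Counter

log_lines = [
    "2024-05-01T10:01:02Z FAILED_LOGIN user=svc_backup src=203.0.113.7",
    "2024-05-01T10:01:09Z FAILED_LOGIN user=svc_backup src=203.0.113.7",
    "2024-05-01T10:01:15Z FAILED_LOGIN user=svc_backup src=203.0.113.7",
    "2024-05-01T10:02:40Z LOGIN user=jdoe src=198.51.100.4",
]

THRESHOLD = 3  # flag accounts with this many failures in the window

failures = Counter(
    line.split("user=")[1].split()[0]
    for line in log_lines
    if "FAILED_LOGIN" in line
)

for account, count in failures.items():
    if count >= THRESHOLD:
        print(f"ALERT: {count} failed logins for account '{account}' in window")
```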

Source: SmartData Collective