Archives September 2022

Data governance building blocks on Google Cloud for financial services

Data governance encompasses people, processes, and technology. Together, these pillars enable organizations to validate and manage their data across dimensions such as:

Data management, including data and pipelines lifecycle management and master data management.

Data protection, spanning data access management, data masking and encryption, along with audit and compliance.  

Data discoverability, including data cataloging, data quality assurance, and data lineage registration and administration.

Data accountability, covering data user identification and policy management requirements.

While investing in people (to achieve the desired cultural transformation) and in processes (to increase operational effectiveness and efficiency) will help enterprises, the technology pillar is the critical enabler that lets people interact with data and lets organizations truly govern their data initiatives.

Financial services organizations face particularly stringent data governance requirements around security, regulatory compliance, and general robustness. Once people are aligned and processes are defined, the challenge for technology comes into the picture: solutions should be flexible enough to complement existing governance processes and cohesive enough across data assets to make data management simpler.

In the following sections, starting with standard requirements for data governance implementations in financial services, we will cover how these correspond to Google Cloud services, open-source resources, and third-party offerings. We will share an architecture capable of supporting the entire data lifecycle, based on our experience implementing data governance solutions with world-class financial services organizations.

Data management

Looking first at the data management dimension, we have compiled some of the most common requirements, along with the relevant Google Cloud services and capabilities from the technology perspective.

Data and pipelines lifecycle management

Requirements: Batch ingestion (data pipeline management, scheduling, and processing logging); streaming pipeline metadata; data lifecycle management; operational metadata, including both state and statistical metadata; a comprehensive end-to-end data platform.

Services & Capabilities: Cloud Storage (GCS) object lifecycle; BigQuery data lifecycle; Data Fusion pipeline lifecycle management, orchestration, coordination, and metadata management; Dataplex intelligent data lifecycle automation; Cloud Logging and Cloud Monitoring; Informatica Axon Data Governance.

Compliance

Requirements: Facilitate regulatory compliance requirements.

Services & Capabilities: Easily expandable to help comply with CCPA, HIPAA, PCI, SOX, and GDPR through security controls implemented using IAM, CMEKs, BigQuery column-level access control, BigQuery table ACLs, data masking, authorized views, DLP PII data identification, and policy tags; DCAM data and analytics assessment framework; CDMC best-practice assessment and certification.

Master Data Management

Requirements: Duplicate-suspect processing rules; solution and department scope.

Services & Capabilities: Enterprise Knowledge Graph; Knowledge Graph entity resolution/reconciliation and financial crime record matching (MDM + ML); Tamr cloud-native master data management.

Site Reliability

Requirements: Data pipeline SLAs; data-at-rest SLAs.

Services & Capabilities: SLAs applied to data pipelines; SLAs applied to services managing data; DR strategies for data.

Registering, creating, and scheduling data pipelines is a recurring challenge that organizations face. Similarly, data lifecycle management is a key part of a comprehensive data governance strategy.

This is where Google Cloud can help, offering multiple data processing engines and data storage options tailored to each need, all integrated so that orchestration and cataloging stay simple.
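
To make the lifecycle capabilities above a little more concrete, here is a minimal sketch using the Cloud Storage and BigQuery Python client libraries; the bucket and table names are placeholders, and real retention periods would come from your data lifecycle policies.

```python
import datetime

from google.cloud import bigquery, storage

BUCKET_NAME = "finserv-landing-zone"                            # placeholder
TABLE_ID = "my-project.governed_dataset.transactions_staging"   # placeholder

# Cloud Storage: delete landing-zone objects 365 days after creation.
storage_client = storage.Client()
bucket = storage_client.get_bucket(BUCKET_NAME)
bucket.add_lifecycle_delete_rule(age=365)
bucket.patch()  # persist the updated lifecycle configuration

# BigQuery: let a staging table expire automatically after 90 days.
bq_client = bigquery.Client()
table = bq_client.get_table(TABLE_ID)
table.expires = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(days=90)
bq_client.update_table(table, ["expires"])
```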

Data protection

Financial organizations demand world-class data protection services and capabilities to support their defined internal processes and help meet regulatory compliance requirements.

Data Access Management

Requirements: Definition of access policies; multi-cloud approval workflow integration*; access approvals.

Services & Capabilities: IAM and ACLs with fine-grained Cloud Storage access; row-level and column-level permissions; BigQuery security; hierarchical resources and policies; users, authentication, security (2FA), and authorization; resources, separation boundaries, organization policies, billing and quota, networking, and monitoring; Event Threat Detection; multi-cloud approval workflow by a third party – Collibra*.

Data Audit & Compliance

Requirements: Operational metadata log capture; alerting on failing processes and root cause identification.

Services & Capabilities: Cloud Audit Logs; Security Command Center; Access Transparency and Access Approval; Cloud Logging (formerly Stackdriver Logging); Collibra audit logging.

Security Health

Requirements: Data vulnerability identification; security health checks.

Services & Capabilities: Security Health Analytics.

Data Masking and Encryption

Requirements: Storage-level encryption metadata; application-level encryption metadata; PII data identification and tagging.

Services & Capabilities: Encryption at rest, encryption in transit, and Cloud KMS; Cloud DLP transformations and de-identification.

Access management, along with data and pipeline audit, is a common requirement that should be managed across the board for all data assets. These security requirements are usually supported by security health checks and automatic remediation processes.

Specifically for data protection, capabilities like data masking, data encryption, and PII data management should be available as an integral part of processing pipelines, and should be defined and managed as policies.
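
As a minimal sketch of what masking as a policy can look like, the snippet below calls the Cloud DLP deidentify_content method to mask detected email addresses in a text value; the project ID is a placeholder, and a production pipeline would typically apply this per column or through de-identification templates.

```python
import google.cloud.dlp_v2 as dlp_v2

PROJECT_ID = "my-governance-project"  # placeholder

dlp = dlp_v2.DlpServiceClient()
parent = f"projects/{PROJECT_ID}"

item = {"value": "Contact jane.doe@example.com about the pending transfer"}
inspect_config = {"info_types": [{"name": "EMAIL_ADDRESS"}]}

# Replace every character of each detected email address with '#'.
deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {
                "primitive_transformation": {
                    "character_mask_config": {"masking_character": "#"}
                }
            }
        ]
    }
}

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "deidentify_config": deidentify_config,
        "inspect_config": inspect_config,
        "item": item,
    }
)
print(response.item.value)  # the email address comes back masked
```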

Data discoverability

Data describes what an organization does and how it relates to its users, competitors, and regulatory institutions. This is why data discoverability capabilities are crucial for financial organizations.

Data Cataloging

Requirements: Data catalog storage; metadata tags associated with fields; data classification metadata registration; schema version control; schema definition before data loading.

Services & Capabilities: Data Catalog; column-level tags; Dataplex logical aggregations (lakes, zones, and assets); DLP; Collibra Catalog; Collibra asset version control; Collibra asset type creation and asset pre-registration; Alation Data Catalog; Informatica Enterprise Data Catalog.

Data Quality

Requirements: Data quality rule definition on ingestion (such as regex validations for each column); issue remediation lifecycle management.

Services & Capabilities: BigQuery DQ; Dataplex; data quality with Dataprep; Collibra DQ; Alation Data Quality; CloudDQ declarative data quality validation (CLI)*; Informatica Data Quality.

Data Lineage

Requirements: Storage- and attribute-level data lineage; multi-cloud/on-premises lineage.

Services & Capabilities: Cloud Data Fusion data lineage (understand the flow, granular visibility into the flow of data, operational view, openness to share lineage); Data Catalog and BigQuery; Collibra lineage with multi-cloud/on-premises management; Alation Data Lineage.

Data Classification

Requirements: Data discovery and data classification metadata registration.

Services & Capabilities: DLP discovery and classification; 90+ built-in classifiers, including PII; custom classifiers.

A data catalog is the foundation on which a large part of a data governance strategy is built. You need automatic classification options and data lineage registration and administration capabilities to make data discoverable. Dataplex is a fully managed data discovery and metadata management service that offers unified discovery of all data assets spread across multiple storage targets. Dataplex empowers users to annotate business metadata, providing the necessary data governance foundation within Google Cloud and producing metadata that can later be integrated with external metadata by a multi-cloud or enterprise-level catalog. The Collibra Catalog is an example of an enterprise data catalog on Google Cloud that complements Dataplex by providing enterprise functionality such as an operating model that includes the business and logical layers of governance, federation, and the ability to catalog across multi-cloud and on-premises environments.
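
To make the business-metadata annotation described above more tangible, here is a rough sketch using the Data Catalog Python client: it creates a simple tag template and attaches a tag to the catalog entry of an existing BigQuery table. The project, location, dataset, and table names are placeholders.

```python
from google.cloud import datacatalog_v1

PROJECT_ID = "my-governance-project"   # placeholder
LOCATION = "us-central1"               # placeholder
LINKED_RESOURCE = (
    f"//bigquery.googleapis.com/projects/{PROJECT_ID}/datasets/finance/tables/trades"
)

client = datacatalog_v1.DataCatalogClient()

# 1. A tag template describing the business metadata we want to capture.
template = datacatalog_v1.TagTemplate()
template.display_name = "Data governance attributes"
template.fields["data_owner"] = datacatalog_v1.TagTemplateField()
template.fields["data_owner"].display_name = "Data owner"
template.fields["data_owner"].type_.primitive_type = (
    datacatalog_v1.FieldType.PrimitiveType.STRING
)
template = client.create_tag_template(
    parent=f"projects/{PROJECT_ID}/locations/{LOCATION}",
    tag_template_id="governance_attributes",
    tag_template=template,
)

# 2. Look up the entry for the BigQuery table and attach a tag to it.
entry = client.lookup_entry(request={"linked_resource": LINKED_RESOURCE})
tag = datacatalog_v1.Tag()
tag.template = template.name
tag.fields["data_owner"] = datacatalog_v1.TagField()
tag.fields["data_owner"].string_value = "finance-data-owners@example.com"
client.create_tag(parent=entry.name, tag=tag)
```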

Data quality assurance and automation is the second foundation of data discoverability. To help with that effort, Dataprep is another tool for assessing, remediating, and validating processes, and it can be used in conjunction with customized data quality libraries such as the Cloud Data Quality Engine (CloudDQ), a declarative and scalable data quality validation command-line interface. Collibra DQ is another data quality assurance tool; it uses machine learning to identify data quality issues, recommend data quality rules, and allow for enhanced discoverability.
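
Product specifics aside, the idea of declarative, per-column rules is easy to illustrate. The sketch below is not CloudDQ or Collibra DQ; it is just a generic, hypothetical example of applying regex validations to an ingested batch with pandas.

```python
import pandas as pd

# Hypothetical declarative rules: column name -> regex its values must fully match.
RULES = {
    "account_id": r"\d{10}",
    "currency": r"[A-Z]{3}",
    "email": r"[^@\s]+@[^@\s]+\.[^@\s]+",
}


def validate(df: pd.DataFrame, rules: dict) -> pd.DataFrame:
    """Return one row per rule violation found in the batch."""
    failures = []
    for column, pattern in rules.items():
        matches = df[column].astype(str).str.fullmatch(pattern)
        for idx in df.index[~matches]:
            failures.append({"row": idx, "column": column, "value": df.at[idx, column]})
    return pd.DataFrame(failures)


batch = pd.DataFrame(
    {
        "account_id": ["1234567890", "12345"],
        "currency": ["USD", "usd"],
        "email": ["a@b.com", "not-an-email"],
    }
)
print(validate(batch, RULES))  # row 1 violates all three rules
```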

Data accountability

Identifying data owners, controllers, stewards, or users, and effectively managing the related metadata, provides organizations with a way to ensure trusted and secure use of the data. Here we have the most commonly identified data accountability requirements and some tools and services you can use to meet them.

Data User Identification

Requirements: Data owner and dataset linked registration; data steward and dataset linked registration; role-based logging of data usage by users.

Services & Capabilities: Dataplex; Data Catalog; Analytics Hub; Collibra Data Stewardship; Alation Data Stewardship.

Policies Management

Requirements: Domain-based policy management; column-level policy management.

Services & Capabilities: Cloud DLP; Dataplex; policy tags; BigQuery column-level security; Collibra Policy Management.

Domain-Based Accountability

Requirements: Governed data sharing.

Services & Capabilities: IAM and ACL role-based access; Analytics Hub.

Having a centralized identity and access management solution across the data landscape is a key accelerator for defining a data security strategy. Core capabilities should include user identification, role- and domain-based access policy management, and policy-managed data access authorization workflows.
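
As one simplified example of role-based access managed in code, the sketch below grants a reader group access to a BigQuery dataset through the Python client. The dataset and group names are placeholders; in practice such bindings would be generated by your policy-management workflow rather than edited by hand.

```python
from google.cloud import bigquery

DATASET_ID = "my-governance-project.finance_curated"   # placeholder
READER_GROUP = "finance-analysts@example.com"          # placeholder

client = bigquery.Client()
dataset = client.get_dataset(DATASET_ID)

# Append a group-level READER grant to the dataset's access entries.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id=READER_GROUP,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```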

Data governance building blocks to meet industry standards 

Given these capabilities, we provide a reference architecture for a multi-cloud and centralized governance environment that enables a financial services organization to meet its requirements. While here we focus on the technology pillar of data governance, it is essential that people and processes are also aligned and well-defined.

The following architecture does not aim to cover each and every requirement presented above, but it provides core building blocks for a data governance implementation that meets industry standards, as far as the technology pillar is concerned, at the time of writing.

1. Data cataloging is a central piece of any data governance technology journey. Finance enterprises often need to deal with several storage systems residing in multiple cloud providers as well as on-premises. As such, an enterprise-level catalog, a “catalog of catalogs” that centralizes and makes discoverable all the data assets in the organization, is a helpful capability for helping the business get the most from its data, wherever it sits.

Even though Google Cloud Data Catalog supports non-Google Cloud data assets through open-source connectors, a third-party cataloging solution (such as Collibra) may be well suited to help here, providing connection capabilities to several storage systems and additional layers of metadata administration. For example, this could enable pre-registering data assets even before they are available in storage, and integrating them once the actual tables or filesets are created, including schema evolution tracking.

2. From a Google Cloud perspective, data to be discovered, cataloged, or protected can reside in a data lake or a landing zone in Cloud Storage, an enterprise data warehouse in BigQuery, a high-throughput, low-latency datastore like Bigtable, or even in relational or NoSQL databases backed by Spanner, Cloud SQL, or Firestore, for example.

Gathering Cloud Data Catalog metadata such as tags is a multi-step process. Financial enterprises should standardize and automate as much as possible to have reliable and complete metadata. To populate the Data Catalog with labels, the Cloud Data Loss Prevention API (DLP) is a key player. DLP inspection templates and inspection jobs can be used to standardize tagging, sampling, and discovering data, and finally to tag tables and filesets. 
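
The snippet below sketches one way to wire that up with the DLP Python client: create a reusable inspection template, then run an inspection job that samples a BigQuery table and saves its findings to another table, from which a downstream tagging job can read. All project, dataset, and table names are placeholders.

```python
import google.cloud.dlp_v2 as dlp_v2

PROJECT_ID = "my-governance-project"   # placeholder

dlp = dlp_v2.DlpServiceClient()
parent = f"projects/{PROJECT_ID}"

# 1. A reusable inspection template standardizes which infoTypes are scanned for.
template = dlp.create_inspect_template(
    request={
        "parent": parent,
        "inspect_template": {
            "display_name": "pii-baseline",
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
                "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
            },
        },
    }
)

# 2. An inspection job samples a BigQuery table and saves findings for later tagging.
inspect_job = {
    "inspect_template_name": template.name,
    "storage_config": {
        "big_query_options": {
            "table_reference": {
                "project_id": PROJECT_ID,
                "dataset_id": "finance",
                "table_id": "trades",
            },
            "rows_limit": 10000,  # sample rather than scan the whole table
        }
    },
    "actions": [
        {
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": PROJECT_ID,
                        "dataset_id": "dlp_results",
                        "table_id": "trades_findings",
                    }
                }
            }
        }
    ],
}
job = dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print(f"Started DLP inspection job: {job.name}")
```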

Security and access control is another big concern for finance organizations given the sensitivity of the data they handle. Several encryption and masking layers are usually applied to the data. In these scenarios, sampling and reading data to determine which labels to add is a slightly more complex process, requiring decryption along the way.

In order to do things like apply column-level policy tags in BigQuery, the DLP inspection job findings need to be published to an intermediate storage location accessible to a tagging job that uses Cloud Data Catalog. In these contexts, a Dataflow job could help handle the required decryption and tagging. There is a step-by-step community tutorial on that here.

Ensuring that the right people access the right data across numerous datasets can be challenging. Policy taxonomy tags, in conjunction with IAM access management, cover that need.
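
As a hedged sketch of column-level security, the snippet below attaches an existing policy tag to a sensitive column through the BigQuery Python client. The taxonomy and policy tag resource name is a placeholder you would first create in Data Catalog, and the Fine-Grained Reader IAM role on that policy tag then controls who can read the column.

```python
from google.cloud import bigquery

TABLE_ID = "my-governance-project.finance.customers"   # placeholder
# Resource name of a policy tag created beforehand in a Data Catalog taxonomy (placeholder).
POLICY_TAG = (
    "projects/my-governance-project/locations/us/"
    "taxonomies/1234567890/policyTags/9876543210"
)

client = bigquery.Client()
table = client.get_table(TABLE_ID)

# Rebuild the schema, attaching the policy tag to the sensitive column.
new_schema = []
for field in table.schema:
    if field.name == "email":
        field = bigquery.SchemaField(
            name=field.name,
            field_type=field.field_type,
            mode=field.mode,
            policy_tags=bigquery.PolicyTagList(names=[POLICY_TAG]),
        )
    new_schema.append(field)

table.schema = new_schema
client.update_table(table, ["schema"])
```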

Google Cloud’s Dataplex service (discussed more below) will also help to automate data discovery and classification using dynamic schema detection, such that metadata can be automatically registered in a Dataproc Metastore or in BigQuery before finally being used by Data Catalog.

3. To understand the origin, movement, and transformation of data over time, data lineage systems are fundamental. These allow users to store and access lineage records and provide reliable traceability to identify data pipeline errors. Given the large volume of data in a finance enterprise data warehouse environment, an automated data lineage recording system can simplify data governance for users.

Finance organizations have to meet compliance and auditability standards, enforce access policies, and perform root cause analysis on poor data or failing pipelines. Cloud Data Catalog lineage and Cloud Data Fusion lineage provide traceability capabilities that can help with all of these.

4. Dataplex is a fundamental part of Google Cloud’s vision for data governance. Dataplex is an intelligent data fabric that unifies and automates data management and allows easy and graphical control for analytics processing jobs. This helps financial organizations meet the complex requirements for data and pipeline lifecycle management.

Dataplex also provides a way to organize data into logical aggregations called lakes, zones, and assets. Assets correspond directly to Cloud Storage files or BigQuery tables, and are logically grouped into zones. Zones can be typical data lake implementation zones, such as raw, refined, or analytics zones, or they can be based on business domains like sales or finance. On top of that logical organization, users can define security policies across their data assets, including granular access control. This way, data owners can grant permissions while data managers monitor and audit the access granted.
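
As a rough sketch, assuming the google-cloud-dataplex Python client, the snippet below creates a lake, adds a raw zone, and registers an existing Cloud Storage bucket as an asset; all resource names are placeholders, and each call returns a long-running operation.

```python
from google.cloud import dataplex_v1

PROJECT_ID = "my-governance-project"   # placeholder
LOCATION = "us-central1"               # placeholder
BUCKET = "finserv-landing-zone"        # placeholder

client = dataplex_v1.DataplexServiceClient()
parent = f"projects/{PROJECT_ID}/locations/{LOCATION}"

# 1. A lake for the finance domain.
lake = client.create_lake(
    parent=parent,
    lake_id="finance-lake",
    lake=dataplex_v1.Lake(display_name="Finance lake"),
).result()

# 2. A raw zone inside the lake.
zone = client.create_zone(
    parent=lake.name,
    zone_id="raw-zone",
    zone=dataplex_v1.Zone(
        type_=dataplex_v1.Zone.Type.RAW,
        resource_spec=dataplex_v1.Zone.ResourceSpec(
            location_type=dataplex_v1.Zone.ResourceSpec.LocationType.SINGLE_REGION
        ),
    ),
).result()

# 3. Register the landing-zone bucket as an asset of the raw zone.
asset = client.create_asset(
    parent=zone.name,
    asset_id="landing-bucket",
    asset=dataplex_v1.Asset(
        resource_spec=dataplex_v1.Asset.ResourceSpec(
            name=f"projects/{PROJECT_ID}/buckets/{BUCKET}",
            type_=dataplex_v1.Asset.ResourceSpec.Type.STORAGE_BUCKET,
        )
    ),
).result()

print(lake.name, zone.name, asset.name)
```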

Build a data governance strategy in the cloud

For financial services organizations to trust their data and meet regulatory compliance requirements, their data governance implementations must rest on a solid and flexible technology pillar from which to build processes and align people. Google Cloud can help you build that comprehensive data governance strategy, while allowing you to add third-party capabilities to meet specific industry needs.

To learn more: 

Listen to this podcast with Googlers Jessi Ashdown and Uri Gilad

See how Dataplex and Data catalog can become key pieces in your data governance strategy

Meet the authors of Data Governance – The Definitive Guide 

Review the principles and best practices for data governance in the cloud in this white paper.

Related Article

Data governance in the cloud – part 1 – People and processes

The role of data governance, why it’s important, and processes that need to be implemented to run an effective data governance program


Source : Data Analytics Read More

Creative Ways to Leverage Big Data for an Optimal Marketing Plan

Big data technology is becoming more important than ever for modern business owners. One study by the McKinsey Global Institute shows that data-driven organizations are 19 times more likely to be profitable.

There are many benefits of using big data to run a business. One of the most important advantages is that big data can help with marketing.

Big Data is Essential for Modern Marketing Strategies

Running a business isn’t easy, especially when it comes to marketing. However, if you want to continue to draw in new customers and clients, continuous marketing is a must. The good news is that big data can help with this. The same McKinsey Global Institute study showed that data-driven businesses are 23 times more likely to acquire customers.

Fortunately, there are ways to use big data to simplify and strengthen those efforts. If you are looking to boost your marketing as a data-driven organization, follow these tips.

Big data has revolutionized marketing. By giving you insights into your customers and into how well your current methods are working, and by helping you build brand awareness, big data can play a crucial role in your success.

The main types of big data you’ll want to capture include:

Customer data

Operational data

Financial data

By collecting and analysing customer data, you’ll get a much better idea of who your target audience is. This can help you figure out the best places to advertise and market your services, as well as determine your brand’s tone of voice. Having a strong understanding of your target audience is crucial in marketing. After all, if you don’t know who you are marketing to, how can you expect to see results?

Operational data refers to the way the business runs, including shipping, logistics, and customer relationship management. Data has become very important for improving customer service. When you have a clear picture of how the business is run, you can make changes that improve performance. This in turn will boost customer satisfaction, leading to more word-of-mouth referrals.

Financial data, such as pricing, sales, and margins, helps you budget more effectively. You will also see where your budget is being wasted, allowing you to switch to more profitable marketing methods.

The more data you collect and analyse, the more targeted and effective your marketing will become.

Update Your Aesthetics

In business, it’s important to make a great first impression. This is difficult to do if the aesthetics of your brand aren’t on point.

Start with your digital aesthetics, such as your logo, website, and social media presence. Does your branding match your business? Having clean, clear aesthetics can help you appear more authoritative and professional.

It isn’t just your digital presence that you need to worry about. How your physical premises are laid out will also make a difference to your marketing efforts. Firstly, it determines a customer or client’s opinion of the business if they visit your premises. Secondly, the aesthetics of your business can impact morale, motivation, and productivity.

Everything from the type of flooring you have installed to how much light enters the premises can make a difference. When it comes to the flooring of your business it should be comfortable, practical, and aesthetically pleasing. It doesn’t have to cost a fortune to update the flooring in your business. There are companies that offer up to 65% off commercial flooring.

These are just some of the ways aesthetics matter in business. If you want to make a good impression, start by giving your online or offline presence a makeover.

Leverage big data for local community engagement

Giving back to your local community is a great way to boost your marketing efforts. Customers and clients generally love brands that use their profits for good.

It could be sponsoring a local sports team, organizing a charity fundraiser, or planting trees and greenery to help improve air quality and aesthetics. Don’t forget to advertise the ways you give back to the community on your social media platforms. Getting involved in your local community could help you to attract a lot of new customers, as well as keep existing ones coming back for more.

It might seem like big data wouldn’t help much with local community engagement. However, there are creative ways to tap data to learn more about your target consumers. This allows you to focus on identifying charities and engagement opportunities that let you be seen by your target customers.

Use Big Data for Reputation Management

You can use data mining to improve reputation management. Data scraper tools help you find positive statements that customers and experts have made about your company. Then, you can showcase these testimonials on your website.

Do you have glowing testimonials you can show off to potential clients? These days consumers need to trust a business before they buy from them. Testimonials and positive reviews can help to put their mind at ease, making them more likely to make a purchase.

You should showcase your testimonials wherever you can, including on your website, social media pages, and in email signatures. Don’t forget to encourage your customers to leave them too. Having a constant stream of positive reviews will do wonders for your brand.

Use Data-Driven SEO

To continuously attract new clients and customers, you need to work on your SEO. Making it easier for you to be found by search engines, the right SEO tactics can boost website traffic, convert more leads into customers, and improve your bottom line.

It can take a lot of work to develop and implement a successful SEO strategy. If you need to, bring in the professionals. SEO companies and freelancers can help you to achieve better rankings with minimal effort on your part.

There are a lot of benefits to using big data in SEO. You can use data-mining tools to identify keywords that are likely to appeal to your target demographic. There are also data-mining tools that can help you identify links to competitor websites, so you can reverse-engineer their link-building strategies.

Offer competitions and giveaways

Giveaways and competitions tend to attract a lot of attention. If you are trying to bolster your marketing efforts, think of a giveaway or competition that your audience will love.

Advertise your offer on social media, asking participants to like, share, and comment on your post. This will boost its visibility to others, making it easier for customers to find you. Limited time giveaways and competitions work best, and you can offer everything from discounts to free products.

Make the most of social media

Social media provides a ton of opportunities for marketing your brand. However, it’s important to focus on just one platform at a time when you are just getting started.

Find out where your ideal audience hangs out, then focus on marketing your business on that channel. With social media you can run paid ads, post valuable content, gather fans and followers, and boost brand awareness.

There are over a billion people using social media sites, giving you access to a huge audience. If your business doesn’t yet have a strong social media presence, now is the time to build one up.

You can’t take an ad hoc approach to social media, though. You will need to invest in social media analytics tools that help you draw more nuanced insights. You can use your data to guide your decision-making process, so you can create the best content, post at the right times, and engage with the right networks.

Use Big Data to Think Outside of the Box

Like everything in business, the best results often come from thinking outside of the box. Big data technology will make this a lot easier. You need to use analytics tools to make observations that you can use to make more informed decisions. When you invest in big data, you can come up with innovative ways to market your business. Look at what your competitors are doing and identify ways to improve on their strategies.

There are tons of ways to boost your marketing efforts with big data technology. The above methods are some of the most effective things you can try out to start seeing bigger, better results. Consistency and continually tracking your efforts with data analytics tools are key to your success.

The post Creative Ways to Leverage Big Data for an Optimal Marketing Plan appeared first on SmartData Collective.

Source : SmartData Collective Read More

Expanding the Google Cloud Ready – Sustainability initiative with 12 new partners

We introduced the Google Cloud Ready – Sustainability designation earlier this year to showcase partners committed to helping global businesses and governments accelerate their sustainability programs. These partners build solutions that enhance the capabilities and ease the adoption of powerful Google Cloud technologies, such as Google Earth Engine and BigQuery, allowing customers to leverage data-rich solutions that help reduce their carbon footprints.

Today, we’re pleased to announce the growth of the Google Cloud Ready – Sustainability program, with 12 new partners joining the initiative and bringing their climate, ESG, and sustainability platforms to Google Cloud. These partners include:

Aclima is pioneering an entirely new way to diagnose the health of our air and track climate-changing pollution. Powered by its network of roving and stationary sensors, Aclima measures air pollution and greenhouse gasses at unprecedented scales and with block-by-block resolution.

Sustainability at Airbus means uniting and safeguarding the world in a safe, ethical, and socially and environmentally responsible way. Airbus has a comprehensive sustainability strategy built on four core commitments, which guide the company’s approach to the way it does business and how it designs its products and services: Lead the journey toward clean aerospace, respect human rights and foster inclusion, build the business on the foundation of safety and quality, and exemplify business integrity.

Atlas AI is a predictive analytics platform that analyzes, monitors, and forecasts regions of growth, vulnerability, and opportunity around the world to offer insight into where organizations can grow most successfully, and where investment can boost historically underserved communities. Atlas AI’s platform has been used to expand water and sanitation infrastructure, promote new electrification, target community health services, and broaden internet access in countries across Sub-Saharan Africa and South Asia.

BlueSky Resources makes sense of sensor data from both public and private sources by harmonizing ground-based, aerial, and space-based inputs. Expertise in atmospheric science and cloud technology allows BlueSky to provide understanding and insights that correlate emissions with assets and activities. This powerful combination of data, climate science, and insight delivery is enabling focused sustainability impact for clients across industries including energy, waste management, industry, and natural resource management.

Electricity Maps provides companies with actionable data quantifying the carbon intensity and origin of electricity. This data is available on an hourly basis across 50+ countries and more than 160 regions. Electricity Maps’ mission is to organize the world’s electricity data to drive the transition toward a truly decarbonized electricity system.

FlexiDAO is a global climate tech company based in the Netherlands and Spain. The company works closely with other critical stakeholders to co-create the international standard around energy-related emissions compliance. Thanks to FlexiDAO’s end-to-end 24/7 Carbon-free Energy platform, companies can quantify and confidently showcase their contribution to society’s decarbonization. 

LevelTen Energy helps organizations achieve carbon-free energy usage targets (on an annual and 24/7 basis) by delivering access to the world’s largest clean energy marketplace, and the software, data, analytics, and expertise required for efficient transactions. The LevelTen Platform connects energy buyers and over 40 sustainability advisors with more than 1,800 carbon-free energy projects in 24 countries across North America and Europe.

Ren is a SaaS platform built on Google Cloud that enables companies with global supply chains to source the cleanest energy possible. Despite using country-sized amounts of energy, most companies have no idea how to transition to renewables due to complex financial, technical, and logistical challenges. Ren unlocks cost savings, provides the cleanest energy possible, and ensures companies meet their carbon commitments on time.

Sidewalk Labs, an urban innovation unit in Google, builds products to radically improve quality of life in cities for all. Delve is a product that helps real estate teams design more sustainable buildings and neighborhood blocks, faster. Mesa automates building controls to deliver savings and comfort to commercial building owners and tenants. With these products and others, Sidewalk Labs helps commercial real estate developers, building owners and city planners make more sustainable choices for the built environment that are better for communities and the planet.

Tomorrow.io is The World’s Weather and Climate Security Platform, helping countries, businesses, and individuals manage their weather and climate security challenges. The platform is fully customizable to any industry impacted by the weather. Customers around the world use Tomorrow.io to dramatically improve operational efficiency. Tomorrow.io was built from the ground up to help teams prepare for the business impact of weather by automating decision-making and enabling climate adaptation at scale. 

UP42 is a geospatial developer platform and marketplace bringing together industry-leading data and ready-to-use processing algorithms. The platform enables organizations to build, run, and scale geospatial products. With the ability to choose from a wide range of high-resolution commercial and open satellite data, aerial, weather, and others, solution providers can apply best-in-class machine learning and/or processing modules to gain valuable geospatial insights and streamline their processes.

Woza is a sustainable innovation platform that leverages deep geospatial knowledge and existing best-in-class technologies to develop a new generation of streamlined analytics workflows focused on sustainability. Companies in agri-food, energy, and public sector are partnering with Woza to accelerate their journey to Industry 4.0.

Adding expertise to accelerate sustainability use cases

New partners in the initiative join our existing Google Cloud Ready – Sustainability partners like CARTO, Climate Engine, Geotab, NGIS, and Planet Labs PBC, bringing a wealth of industry knowledge and offering solutions for sustainability challenges ranging from first-mile sustainable sourcing and spatial finance to fleet electrification and rich geospatial visualizations.

CARTO is the world’s leading Location Intelligence platform, enabling organizations to use spatial data and analysis for more efficient delivery routes, better behavioral marketing, strategic store placements, and much more. The company’s solutions extend the geospatial capabilities available in BigQuery, while leveraging the near limitless scalability that Google Cloud provides. When it comes to sustainability, CARTO’s platform is trusted by a wide range of organizations, including Greenpeace, Vizzuality, Litterati, Indigo, WWF, the Marine Conservation Institute, The World Bank, and the Institute for Sustainable Cities.

Climate Engine leverages data from Google Earth Engine and other ecosystem partners to help organizations improve their climate change-related risk planning in areas such as water use, agriculture, storm risk, and wildfire spread. By linking the economy and the environment, organizations can understand how environmental risks are affecting their markets and discover opportunities to reduce their emissions and potential supply chain or operational disruptions from climate-related events. 

Geotab is advancing security, connecting commercial vehicles to the cloud, and providing data-driven analytics to help customers better manage their fleets. Processing billions of data points daily, Geotab helps businesses improve and optimize fleet productivity, enhance safety, and achieve sustainability goals and stronger compliance.

Geospatial solutions provider NGIS built a SaaS-based first-mile sustainable sourcing solution called TraceMark using Google Cloud’s geospatial platform and technologies from other ecosystem partners. Several global CPG firms have already used TraceMark to modernize their geospatial workflows and help facilitate the use of space-based data for supply chain sustainability transformation. 

Planet Labs PBC operates the largest fleet of Earth imaging satellites in history, with approximately 200 satellites in orbit. Planet’s mission is to image the whole earth’s landmass every day to make global change visible, accessible, and actionable. As a Public Benefit Corporation, Planet’s Public Benefit Purpose is to accelerate humanity toward a more sustainable, secure, and prosperous world by illuminating environmental and social change. 

How the Google Cloud Ready – Sustainability program works

If you are a Google Cloud partner with sustainability solutions and expertise to share, the  Google Cloud Ready – Sustainability program is open for applications. Entry into the program requires that the partner solution delivers quantifiable results for climate mitigation, adaptation, or reporting needs. To apply for the Google Cloud Ready – Sustainability designation, the solution must: 

Be available on Google Cloud

Address ESG risk, and assist customers in achieving ESG targets and/or support typical ESG goal frameworks, such as the United Nations’ SDGs

Demonstrate repeatability

Meet minimum Google Cloud application development best practices, including security, performance, scalability, availability, and carbon footprint reporting for available services

Have Google Cloud Carbon Footprint Reporting enabled

Have at least one public customer case study available. 

The selection process begins with an evaluation of the solution. If a partner meets the above criteria, Google Cloud provides a suggested roadmap for tier progression within the program and then issues a formal acknowledgement of participation in the Google Cloud Ready – Sustainability program. Together, Google Cloud sustainability partners can deliver platforms that are helping businesses and governments accelerate progress aligned to their environmental goals. 

Google Cloud will showcase the validated solutions on the Google Cloud Partner Directory Listing, Google Cloud Ready Sustainability Partner Advantage page, and — if applicable — via the Google Cloud Marketplace. We hope to help customers better understand how these technologies can help them meet their ESG goals, find the right solution for their particular challenge, and implement a solution faster. 

Prospective partners can visit the Partner Portal to learn more about the Google Cloud Ready – Sustainability program or complete an application.

Related Article

Google Cloud announces new products, partners and programs to accelerate sustainable transformations

In advance of the Google Cloud Sustainability Summit, we announced new programs and tools to help drive sustainable digital transformation.


Source : Data Analytics Read More