Salesforce, founded in 1999 by Marc Benioff, is the world’s #1 Customer Relationship Management (CRM) service provider, designed to automate many of a company’s sales processes. It also offers a complementary catalogue of enterprise applications covering key business verticals such as customer service, marketing, analytics, and application development. A solution of this scope inevitably generates enormous amounts of data, much of which cannot simply be deleted. Keeping in mind the common problem of enterprises eventually running out of their limited data storage, Salesforce introduced its innovative ‘Big Objects’ solution in 2018, a big-data-based storage system.

Big Objects allow Salesforce customers to store and manage an enormous volume of data (typically billions of records or more) within the Salesforce platform. Their standout feature is a guarantee of consistent performance whether the data runs to 100 million records or 1 billion. Another prominent factor is easy access for your organization or external systems through a standard set of APIs. Broadly, Big Objects are classified into two types:
Standard Big Objects – Defined by Salesforce itself and included in its product list, Standard Big Objects work straight out of the box and are non-customizable. Field History Archive is one such Big Object; it allows customers to archive their field history data for 10 years or more, aiding compliance with industry regulations related to auditing and data retention.
Custom Big Objects – Contrary to Standard Big Objects, Custom Big Objects are defined and deployed by customers themselves, either through the Metadata API or directly from Setup, to archive information unique to their organization. Deploying via the Metadata API involves creating an object file (containing the object’s definition, fields, and index), along with a permission set and a package file. Through Setup, a Custom Big Object is created by defining its fields and building the index that determines how the big object is queried.
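As a sketch of what the Metadata API object file can look like (the object, field, and index names below are hypothetical, used only for illustration), a custom big object is defined in a file such as Customer_Archive__b.object, with the index listing the fields used to query it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <!-- Hypothetical fields for an archived-customer-data big object -->
    <fields>
        <fullName>Account__c</fullName>
        <label>Account</label>
        <length>18</length>
        <type>Text</type>
        <required>true</required>
    </fields>
    <fields>
        <fullName>Archived_Date__c</fullName>
        <label>Archived Date</label>
        <type>DateTime</type>
        <required>true</required>
    </fields>
    <!-- The index defines how the big object can be queried;
         its field order matters and cannot be changed after deployment -->
    <indexes>
        <fullName>CustomerArchiveIndex</fullName>
        <label>Customer Archive Index</label>
        <fields>
            <name>Account__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Archived_Date__c</name>
            <sortDirection>ASC</sortDirection>
        </fields>
    </indexes>
    <label>Customer Archive</label>
    <pluralLabel>Customer Archives</pluralLabel>
</CustomObject>
```

The accompanying permission set grants object and field access, and a package.xml manifest lists the object so the whole bundle can be deployed together.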
Now that you have a clear idea of Big Objects, let’s unlock the mystery behind them and understand why top Salesforce customers are choosing them to transform their data archival needs.
When the amount of Salesforce data reaches billions of rows, standard objects become impractical and additional storage is required. To bypass the soaring prices of additional storage, enterprises can opt for Big Objects, where all the less frequently accessed historical data can be easily and conveniently archived.
Since the historical data stays well within the Salesforce ecosystem, without ever being exposed to any external system, organizations dealing with sensitive data can rely on Big Objects with confidence.
In addition to being cost-effective, Big Objects offer the advantage of keeping a huge volume of archived historical data readily available whenever the need arises.
Unlike other systems, Big Objects eliminate or diminish the need to create integrated relationships between archived historical data and live records, and then maintain them. This clears the way for an extremely user-friendly and streamlined experience.
Typically, most enterprises use Big Objects for fulfilling one of these three specific use cases:
360-degree view of the customer – Custom Big Objects can be deployed to integrate all the detailed customer information in a single place. Comprehensive information about various loyalty programs, feeds, clicks, billing details, provisioning information, and much more can be stored here. This helps optimize customer relationships and produce a complete customer analysis report.
Auditing and Tracking – Custom Big Objects are frequently used for auditing and tracking purposes, where archived data helps keep a long-term view of users’ Salesforce usage, mainly for analysis or compliance. This is extremely useful in heavily regulated industries such as finance, healthcare, and government.
Historical Archive – The real perk of Big Objects is the historical archive capability, which potentially solves the age-old Salesforce data storage problem. In this model, proper access to the historical data is maintained for analysis or compliance purposes while the performance of the core CRM or Lightning Platform applications is optimized.
Having elaborated on the specific scenarios where Big Objects are helpful, it’s best to shed some light on the key benefits they provide:
Future Storage: Big Objects are essentially Salesforce’s Big Data-based storage system, keeping all the important historical data on a secure native platform and guaranteeing the highest levels of security and integrity. Even as the company expands and ever more data is generated, this storage can easily scale with it, and can therefore be relied on as a steady means of future storage.
Cost-effective: Big Objects offer an extremely cost-effective storage solution compared to the additional data storage sold by Salesforce, which is immensely expensive. An organization can ideally save over 80% of its data storage costs simply by leveraging Big Objects.
Analytics and Prediction: Analytics can be run efficiently on the huge volumes of data stored in Big Objects, and essential insights can easily be narrowed down from heaps of data. This reduces the need to implement external analytical tools and techniques, saving costs even further.
Async SOQL: Async SOQL is perhaps the most efficient way of processing large volumes of data stored in Big Objects. It is highly scalable and uses a subset of standard SOQL commands, making it simple to pick up for anyone already familiar with SOQL.
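As an illustration (the API version, object, and field names here are hypothetical), an Async SOQL query is submitted as a JSON body in a POST request to the REST API’s async-queries endpoint, and Salesforce processes it in the background rather than within a synchronous request:

```
POST /services/data/v45.0/async-queries/

{
  "query": "SELECT Account__c, Archived_Date__c FROM Customer_Archive__b WHERE Archived_Date__c > 2015-01-01T00:00:00Z"
}
```

The response returns a job record whose status can be checked with a follow-up GET request, which is what makes this approach practical at the scale of billions of rows.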
Improvement of User Experience: Since Big Objects are built to process large volumes of data quickly, they ultimately reduce the load on the primary data storage. This eventually improves the overall performance of the enterprise’s CRM.
Addressing Audit and Compliance Demands: Regulations such as GDPR impose strict requirements on how businesses operating in the EU store and retain customer-related data. Using Big Objects, an enterprise can retain its Salesforce data for longer periods to meet such compliance requirements. They are the perfect solution for keeping customer data within the Salesforce system itself without exposing it externally.
At this juncture, it is important to point out that Big Objects are merely a storage option; to use them to archive historical data, an archiving solution is necessary. DataArchiva, the first & only NATIVE data archiving solution for Salesforce powered by Big Objects, is the one-stop solution for all your Salesforce data archival needs. By periodically archiving historical data, DataArchiva can potentially save 85%+ of your Salesforce data storage costs, and CRM performance is never affected by the expanding data, which can be stored for years. To know more about it, please get in touch with us.

Because of the scale at which Big Objects operate, they don’t work exactly like non-big objects and are bound to have some limitations of their own:
Big Objects support only object and field permissions, not sharing rules. This is important if access to the object’s data must be restricted based on roles.
Once a Big Object has been deployed, its index cannot be edited or deleted. To implement this change, you will have to start over with an entirely new Big Object.
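To see why the index deserves careful up-front design, consider a synchronous SOQL query against a hypothetical Customer_Archive__b big object whose index is built on Account__c and then Archived_Date__c. Filters must reference the index fields in order, starting from the first, with no gaps:

```sql
SELECT Account__c, Archived_Date__c
FROM Customer_Archive__b
WHERE Account__c = '001xx000003DGb2AAG'
  AND Archived_Date__c > 2015-01-01T00:00:00Z
```

A query that filtered on Archived_Date__c alone, skipping the leading index field, would be rejected, and since the index cannot be changed after deployment, the query patterns you expect should drive the index design from the start.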
You cannot define or add a field to a custom Big Object through the UI. To do so, the Metadata API has to be used, which is trickier.
Big Objects only support custom Salesforce Lightning and Visualforce components, not standard UI elements like home pages, detail pages, and list views. This means you cannot surface Big Object data in the UI without building custom components first.
Only up to 100 Big Objects can be created per organization. The limits on Big Object fields are similar to those on custom objects.
Big Objects don’t support transactions that include big objects together with standard or custom objects. To manage their scale of data, they also don’t support features like triggers, flows, processes, and the Salesforce app.
If you want to bypass these limitations of Big Objects while still archiving your historical data, you can always opt for external archiving using DataConnectiva. It is an external data archiving solution for Salesforce customers that lets them archive their significant data to any compatible external database. It also saves over 90% of data storage costs, improves CRM performance, and boosts compliance. To know more about it, please get in touch with us.

In general, Big Objects give users a strategy to archive huge amounts of data without worrying about storage capacities. They handle enormous numbers of records and run queries designed for consistent performance at scale, bringing Big Data capability to the Salesforce platform while ensuring a consistent and scalable experience. In short, Big Objects help Salesforce businesses and developers handle their Big Data easily.
Different Types of Salesforce Data Archiving & Their Importance
Download the eBook to learn more about different types of Salesforce data archiving, their importance, and how to implement an enterprise-grade archiving process.