Why Smaller Firms Are Getting Hammered: Myths of Scale

There is a bit of folklore in most heavily regulated industries, and especially in the financial services industry, that goes something like this: Larger players should worry the most about regulatory compliance and security.
The reasoning is simple: Larger, well-known firms are the ones most likely to be targeted by both cyber criminals and government agencies. Smaller firms—even moderately sized ones—will tend to “fly under the radar” of both, and so can put off investing in technology (RegTech) or training until they have grown big enough to attract attention. In short, worrying about compliance or cybersecurity is a matter of scale, but only in the roughest sense: There are two size classes, bigger firms that need to worry about these things, and everyone else.
The term “folklore” is apt here because this kind of thinking is never written down explicitly, yet it is assumed by many. And while it might have applied in the past, it is surely not the case now—which means that too many regulated firms are toiling under a false, and risky, assumption.
With Automation, Everyone is On the Radar: The Ransomware Example

So what has changed?

Let’s look at the logic with a specific example: Ransomware gangs that try to gain a foothold in a business to take a chunk of the firm’s data hostage.
Not too long ago, gaining access to critical business systems took some time and diligence on the part of the hackers. They had to either defeat the company’s security and encryption, or else dupe an employee into giving up their credentials (easier to do, the more employees there are). Because an attack took time and effort, it made sense for hackers to go “big game hunting”—that is, to get the best bang for their buck by targeting larger firms with bigger cash flows. That is where the best payoff would be.
What has changed since those days is automation. A ransomware gang can now target hundreds of firms of various sizes, all at the same time, looking for vulnerabilities and beginning spear-phishing attacks to gain system access. They can then focus their energies on those that prove vulnerable, even if the payoff is much less for any one successful attempt.
And which firms tend to be the most vulnerable? It is exactly the small-to-medium-sized firms, because they have bought into the folklore that says hackers won’t bother targeting them, and so they don’t take the necessary steps to protect themselves.
Think of it as a contrast between a cat burglar and a gang of street thieves: The cat burglar spends his time trying to pick the lock on a single door, hoping there is a stash behind it. What the gang of thieves lacks in skill and finesse, it more than makes up for in manpower: The thieves simply try every door, hoping that, eventually, one will be unlocked. The unlocked rooms might not be as lucrative, but they are also much less likely to have adequate security measures in place. Today’s hackers are no longer cat burglars; they are gangs looking for easy scores—and smaller firms are exactly that.
Regulatory Compliance is Playing the Same Game
Ransomware is just one example of a risk to which firms of all sizes are now exposed. A similar logic now applies to regulatory compliance, too.
Government institutions, for a long time, went after bigger firms, believing they would be the most egregious offenders when it came to compliance. Smaller firms would not attract much scrutiny, unless something was directly brought to the attention of regulators.
This is no longer the case, and again, automation is part of the story. Government agencies are now using automation and artificial intelligence to “find the needle of market abuse in the haystack of transaction data,” deploying algorithms that scrape the web for deceptive advertising and capture red flags that might indicate wrongdoing. They are also using these tools to zero in on accounting and disclosure violations. Regulators can now spot potential problems more quickly and quietly than ever before, and more small firms are receiving MRA (Matters Requiring Attention) letters from regulators, surprised that they are no longer invisible.
This is an importantly different phenomenon from regulatory tiering. It has always been the case that many regulations carve out exceptions for smaller businesses, when strict compliance would be an undue burden on them. For example, health insurance mandates and employment laws have clauses that exclude firms of a particular size. While it can be debated how and when such tiering should occur, the fact is that many businesses fall under the more regulated tier by law, but have traditionally escaped scrutiny because they were “small enough.” Those days are now over.
Beware, Data Scales Quickly
Part of the issue for financial services firms is not only the sheer amount of data they generate, but the kinds of data they generate.
The volume of data generated correlates pretty well with the size of a firm. This makes sense: The larger the firm, the larger the customer base, and the more transactions happen every day.
But the compliance nightmare comes more from the huge variety of data generated by financial services firms, and that variety does not scale: It’s huge, whether you are a small local firm or a large international one. For example, on top of transactional data, a financial services firm might have
  • Client personal data (name, address, birthday, Social Security number, etc.)
  • Credit information
  • Mortgage and loan information
  • Contract information
  • Email and other logged communications
  • Employee personal information and pay information
  • Analytics data (based on customer spending patterns, segments, new products, customer feedback, etc.)
  • Marketing data (email open rates, website visits, direct mail sent, cost of obtaining a new customer, etc.)
…and much more. That data often resides on different servers and within an array of applications, often in different departments.
This means that, when it comes to complying with data privacy laws, or protecting data with the right cybersecurity measures, size doesn’t matter. The variety of data is a problem for firms of all sizes.
Moral of the Story: Smaller Firms Need Protection, Too. Yes, You.
The folklore says that smaller regulated firms can put off investment in cybersecurity and RegTech simply because cyber threats and regulatory scrutiny will “pass over” smaller firms and land, instead, on the bigger players.
That is no longer the case. Both cyber criminals and government regulators are using tools to spot problems more quickly and easily, and it is worth their while to set those tools to investigate everyone. (We’ll let readers decide which they would rather be spotted by first.) Indeed, small- and medium-sized firms are having a more difficult time now, because it is much less common for these firms to have proactively invested in preventive solutions.
So what do you do if you are a smaller company in a heavily regulated industry? The first step would be to look into technology that can give you the most protection for your dollar. After all, if cybercriminals and government agencies are going to use advanced digital tools, you should too. An immutable data archive, automated compliance workflows, and application retirement tools are all a good beginning.
The alternative would be to do nothing, and hope that your turn will not come up. But strategies based on folklore have never been very good at reducing risks—quite the contrary.

Legacy Data is Valuable, but Legacy Applications Are a Risk. What’s a Finserv IT Department to Do?

In January 2022, infosec and tech news sources blew up with stories about “Elephant Beetle,” a criminal organization that was siphoning millions of dollars from financial institutions and likely had been doing so for over four years.
This group is sophisticated. It is patient. But the truly frightening part of the story was that the group was not using genius-level hacking techniques on modern technology. On the contrary: Its MO is to exploit known flaws in legacy Java applications running on Linux-based machines and web servers. Working carefully, the group has been able to create fraudulent transactions, stealing small amounts of money over long periods. All told, the group managed to siphon millions unnoticed.
What this story reveals is that legacy applications are a huge liability to financial institutions. They have known vulnerabilities that, too often, go unpatched, and they can be exploited without raising an alarm.
Even if they are not breached by bad actors, such systems can be a drag on institutional resources. Research by Ripple back in 2016 estimated that maintaining legacy systems accounted for some 78% of the IT spending of banks and similar financial institutions.
The question is: Why are financial institutions holding onto these legacy applications? And what can be done to minimize the risk they pose?
The Need to Retain Legacy Data
Most companies have at least one legacy application they still support simply because they need its data. In some cases, that data has to be held to fulfill retention requirements for operational or compliance purposes. In other cases, that data holds important business insights that can be unearthed and used profitably. And in some cases, it’s a mix of both.
But with the march of technological progress, it’s easy to see how this situation can get out of hand. Some organizations have hundreds of legacy applications they’ve accumulated over time just to keep easy access to the data stored within them. The vast majority of those applications are no longer being fed live data, having been replaced by next-generation solutions. But they sit on the organization’s network just the same, taking up resources and allowing bad actors to have a back door to other more mission-critical applications.
Because these applications are no longer in use or being updated with new data, it does not make sense to patch and maintain them continually. (And in most cases, patching is not even an option, as no one supports the legacy application anymore.) To really make them safe, they should instead be retired. Application retirement, done correctly, can render an application inert while still providing select users access to its data.
Application Retirement is Now a Part of Finserv IT Best Practices
Application retirement is the process used to decommission applications (and any supporting hardware or software) while securely keeping their data accessible to maintain business continuity. As an organization’s application portfolio grows, develops, and changes, the number of legacy applications that need to be carefully retired keeps climbing.
An application retirement program is simply a strategy, including a schedule and a set of rules, for ensuring that application retirement happens correctly. Any application retirement program must ensure that the right data is retained and in the correct business context so that the data remains meaningful to business users (or auditors) long after the legacy application is decommissioned.
More and more businesses are looking for increased sophistication when it comes to such programs because they provide:
  • Direct savings by eliminating legacy support and maintenance costs (which, again, can account for 78% of financial institutions’ IT budget).
  • Efficiency gains by delivering effortless access to historical business data.
  • Regulatory compliance through application retention rules that manage data securely throughout its lifecycle.
That said, the most challenging part of a retirement program often is getting it off the ground. Getting leadership buy-in, determining retirement criteria, and classifying all of the applications in the ecosystem can produce hurdles to effective execution.
And even when those hurdles are cleared, the process itself takes a great degree of technical expertise to execute. System analysis; data extraction, processing, and storage; user access design—all of these need to happen seamlessly before the application itself can be successfully retired.
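Those phases can be sketched, at a very high level, as a simple ordered workflow. This is an illustrative outline only; the phase names and the `retire_application` function are hypothetical, not a prescribed tool or process:

```python
# Hypothetical sketch of the retirement phases named above, in order.
# Each phase must complete before the application can be decommissioned.
def retire_application(app: str) -> list[str]:
    """Run the retirement phases in order, returning a simple audit trail."""
    phases = [
        "analyze system and classify its data",
        "extract and validate legacy data",
        "load data into the archive with business context",
        "design and grant read-only user access",
        "decommission the application and hardware",
    ]
    return [f"{app}: {phase}" for phase in phases]

# Example: retiring a (fictional) legacy loan system.
trail = retire_application("legacy-loan-system")
```

The point of returning an audit trail, even in a sketch, is that every phase of a real retirement should leave evidence for auditors that it was performed.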
Getting the Expertise on Board for a Successful Application Retirement Program
We’ve only scratched the surface here regarding best practices around application retirement. Those interested in learning more about the application retirement process and the best practices around a robust application retirement program are encouraged to download our white paper: Application Retirement: The Practical Guide for Legacy Application Maintenance.
But even with this quick glimpse, two things are already clear. One is that it takes a good deal of technical know-how, as well as best-in-class tools and technology, to successfully retire applications while still making the critical data within them accessible.
The other is that relying on the status quo is simply not a viable option. While the organizational and technical hurdles to proper application retirement are substantial, their cost is far outweighed by the cost of doing nothing—which includes not only wasting valuable IT department time and budget but also leaving the back door open to bad actors.
Interested in tools that make application retirement easier? Ask us about Infobelt’s Application Retirement Workbench, available with Omni Archive Manager.

Archiving vs. Backup: Which Do You Really Need for Compliance?

Everyone who has worked in records management has seen it before: Organizations keeping their backup copies of production data “because it’s needed for compliance.” This, however, turns out to be a costly move…and one that does not really address data retention needs. What is really needed for data retention is a proper data archiving system.
Which prompts the question: What is the difference? Why is backup not suitable for compliance, and what is gained from investing in a true enterprise data archive?
Archiving vs. Backup: Two Different Missions
The short answer to the above is that archiving solutions and backup solutions were created with two different goals in mind:
  • Backup makes a copy of data (both active and inactive) so that, should that data become damaged, corrupted, or missing, it can be recovered quickly.
  • Archiving makes a copy of inactive or historical data so it can be stored in an unalterable, cost-effective way for legal or compliance reasons.
Backup is an important part of a business continuity plan. Should a piece of hardware fail, or a database become corrupted, it still will be possible to recover the necessary data to keep business operations going.
Maintaining a backup system can be costly, however. The data in the system needs to be updated often, and made easily recoverable, should a disaster happen. The space and cost required to do so can become quite large as an organization’s data grows.
Archiving stems from the realization that not all data an organization has is needed for daily operations—it is not production data. Examples include old forms, transaction records, old email communications, closed accounts, and other historical data. But while this data has no ongoing use, it has to be kept to comply with laws having to do with data retention.
It’s easy to see how the two might be confused—after all, both kinds of technology are, in essence, making a copy of the organization’s data.
But whenever you have two different goals or purposes for two different pieces of technology, you are going to have some important differences as well. If those differences are large enough, you won’t be able to simply swap one technology for the other. At least, not without some major problems.
First Major Difference: The Cost of Space
When a bit of data is stored, there is a cost associated with it. That’s true whether that data sits in the cloud, on an on-prem server, or on a tape drive in a closet somewhere.
Not all storage costs are equal. Take cloud providers like AWS, Microsoft (Azure), and Google, for example. These big players tier their storage offerings, basing the price on things like accessibility, security, and optimization for computations. “Hot storage” holds data that might be used day-to-day and needs to be optimized for computing, and so is much more expensive. “Cool” or “cold” storage is for data that is rarely used, and so does not need to be optimized or accessed quickly. Thus, it tends to be cheaper—sometimes by half or more.
The same goes for on-prem storage. Some data needs to be readily accessible, and so located on a server that needs to be maintained and secured. There are many more options for data that does not need to be accessible, like historical data.
The longer an organization stays up and running, the larger its inactive historical data grows in proportion to its active data. This is why archiving is important: It saves this inactive data in a much more cost-efficient way, freeing up the systems that traffic in active data (and freeing up storage budget).
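To make the tiering math concrete, here is a rough sketch of the cost difference archiving can make. The per-GB prices are hypothetical placeholders, not any provider’s actual rates:

```python
# Hypothetical per-GB monthly prices, for illustration only; real cloud
# pricing varies by provider, region, and tier.
HOT_PRICE_PER_GB = 0.023   # "hot" tier, optimized for frequent access
COLD_PRICE_PER_GB = 0.004  # "cold" archive tier

def monthly_storage_cost(active_gb: float, inactive_gb: float,
                         archive_inactive: bool) -> float:
    """Estimate monthly storage spend, with or without an archive."""
    if archive_inactive:
        return active_gb * HOT_PRICE_PER_GB + inactive_gb * COLD_PRICE_PER_GB
    # Without an archive, everything sits in the expensive hot tier.
    return (active_gb + inactive_gb) * HOT_PRICE_PER_GB

# Example: a firm with 10 TB of active data and 40 TB of historical data.
without = monthly_storage_cost(10_000, 40_000, archive_inactive=False)
with_archive = monthly_storage_cost(10_000, 40_000, archive_inactive=True)
```

With these illustrative numbers, moving the 40 TB of inactive data to the cold tier cuts the monthly bill from about $1,150 to about $390, and the gap widens as historical data keeps accumulating.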
Second Major Difference: Immutability
An important part of compliance with data retention laws is keeping the data in an unaltered, and unalterable, state. This is where the idea of immutable storage comes into play. Immutable storage, such as a WORM (write once, read many) datastore, cannot be altered, even by an administrator. The data is, in a sense, “frozen in time.”
This is important for legal purposes. If data is needed for any reason, it is important to show that it has been stored in a way that resists any sort of tampering or altering. In short, immutability is built into most data archiving solutions, because immutability is important for the very tasks for which archives were engineered. The same might not always be true for data backups.
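What WORM semantics mean in practice can be sketched in a few lines. Real immutable archives enforce this at the storage or hardware layer; the `WormArchive` class below is purely illustrative:

```python
# Minimal sketch of WORM ("write once, read many") behavior.
class WormArchive:
    def __init__(self):
        self._records = {}

    def write(self, key: str, data: bytes) -> None:
        if key in self._records:
            # Once written, a record can never be altered -- not even
            # by an administrator.
            raise PermissionError(f"record {key!r} is immutable")
        self._records[key] = data

    def read(self, key: str) -> bytes:
        return self._records[key]

archive = WormArchive()
archive.write("trade-2021-001", b"original record")
try:
    archive.write("trade-2021-001", b"tampered record")  # rejected
except PermissionError:
    pass  # the original record survives untouched
```

Note that reads can happen any number of times; only rewrites are refused, which is exactly the property that makes the stored data trustworthy as evidence.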
Another benefit of immutability: It provides built-in protection against ransomware attacks.
Third Major Difference: Logging and Tracking
Along with immutability comes the idea of logging or tracking who has accessed a particular bit of data. Having a log of who accessed which data, and when, leaves an important trail of breadcrumbs when it comes to audits, as well as data privacy incidents. Most backup systems do not need this level of logging and tracking—they usually carry just enough information to verify when a backup or recovery has run, and how successful it was. Archiving provides a much more granular level of detail.
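The kind of breadcrumb trail described above can be sketched as follows; the log fields and the `read_with_audit` helper are illustrative, not any specific product’s schema:

```python
# Sketch of granular access logging layered on top of a datastore.
from datetime import datetime, timezone

access_log = []  # in a real archive, this would itself be immutable storage

def read_with_audit(store: dict, user: str, key: str):
    """Read a record and leave a breadcrumb for auditors."""
    access_log.append({
        "user": user,
        "record": key,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return store.get(key)

store = {"acct-1042": "closed account record"}
read_with_audit(store, "auditor.jane", "acct-1042")
```

A backup system would typically log only that a backup job ran and succeeded; the per-record, per-user detail above is what makes an archive audit-ready.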
Fourth Major Difference: Scheduled Destruction
Once data is no longer needed for compliance purposes, it should be destroyed. That way, it no longer takes up space, nor runs the risk of being compromised (which can be a data privacy issue).
Best-in-class archives, because they are focused on compliance needs, have such scheduled destruction built in. Backup systems usually do not, as that would be antithetical to their purpose of saving data. (At best, backup systems overwrite previous backups, and some let the user determine how many backup copies need to stay current.)
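Scheduled destruction amounts to a retention rule plus a periodic purge, roughly like this sketch. The seven-year period is only an example; actual retention periods depend on the governing regulation:

```python
# Sketch of rule-based scheduled destruction.
from datetime import date, timedelta

RETENTION = timedelta(days=7 * 365)  # example: a seven-year retention rule

def purge_expired(records: dict, today: date) -> dict:
    """Keep only records still inside their retention window."""
    return {key: created for key, created in records.items()
            if today - created < RETENTION}

records = {
    "stmt-2010": date(2010, 1, 15),  # long past retention
    "stmt-2024": date(2024, 6, 1),   # still retained
}
remaining = purge_expired(records, today=date(2025, 1, 1))
```

Running a purge like this on a schedule is what lets an archive shrink over time, whereas a backup system is designed to keep everything it is given.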
Archiving and Backup: Which Does Your Organization Need? (And How Do You Know?)
Really, most enterprise-sized organizations need both. Business continuity plans need to include solutions for backup.
But those solutions make for a very costly, and mostly inadequate, archiving solution for compliance purposes. Different technology is needed for this.
So, if your organization is discussing disaster recovery and prioritizing things like speed to get up and running again with your production data intact, it’s best to explore a backup solution.
But if, like the organizations described above, you are looking to retain records or other data for compliance purposes, invest in a data archive.
Barry Burke, storage guru and CTO of Dell EMC for years, has a great way of conceptualizing the difference between the two technologies, looking not at what is done, but what the intent behind the action is:
In explaining the word “archive” we came up with two separate Japanese words. One was “katazukeru,” and the other was “shimau”…Both words mean “to put away,” but the motivation that drives this activity changes the word usage.

The first reason, katazukeru, is because the table is important; you need the table to be empty or less cluttered to use it for something else, perhaps play a card game, work on arts and crafts, or pay your bills.

The second reason, shimau, is because the plates are important; perhaps they are your best tableware, used only for holidays or special occasions, and you don’t want to risk having them broken.
If plates are data and the table is your production storage system, then backup is shimau: The data is important to save, even at a high cost. Archiving is katazukeru: It’s the system itself that must be cleared so you can get on with the next activity…but, of course, you still want and need to save the plates.
Interested in what an archiving solution can do for your organization above and beyond backup? Take a look at our Omni Archive Manager, or reach out to talk to one of our specialists.

Finserv Has More Kinds of Data Than You Think, and It’s a Compliance Nightmare

It should surprise no one that today’s corporations generate a lot of data. And they will continue to do so at an increasing rate: From 2020 to 2022, the amount of data the average enterprise stores more than doubled, from one petabyte to roughly 2.022 petabytes. That’s over 100% growth in just two years.
Financial services (“Finserv”) firms create more than their fair share of that data. Even a modest-sized regional bank will likely traffic in as much data as a company ten times its size. But what few experts have come to grips with is the sheer variety of data that finserv companies must manage.
All that variety creates a huge hurdle for data management and compliance simply because most solutions on the market specialize in certain types of data only. This fact has forced most finserv companies to cobble together several disparate solutions…or to forego any sort of data management whatsoever.
And that is creating an extremely large but hidden source of risk for finserv firms.
The Varieties of Data in the Average Financial Institution
Consider for a moment all the sources of data that, say, a regional bank traffics in every day:
  • Transactions at all physical locations
  • Transactions carried out online and via a mobile app
  • Client personal data (name, address, birthday, social security number, etc.)
  • Account information (account numbers, transactions, balances)
  • Spending categorization
  • Credit information
  • Mortgage and loan information
  • Contract information
  • Emails (to the tune of 128 messages sent and received each day, on average, per employee)
  • Employee personal information and pay information
  • Employee logs
  • Analytics data (based on customer spending patterns, segments, new products, customer feedback, etc.)
  • Marketing data (email open rates, website visits, direct mail sent, cost of obtaining a new customer, etc.)
  • Customer service data (tickets, rep notes, inquiries, dates)
  • Network usage and access statistics
  • General data on markets, commodities, and prices
A similar exercise works for other finserv companies (insurance companies, wealth management firms, etc.).
Looking at this list, it’s clear that all this data is gathered, stored, and used by different departments within the organization. In part because of that, data is probably also spread across several systems—for example, an OLTP database for online transactions, an OLAP database so that marketing can do interesting analytics work, an email server maintained by IT, etc.
It’s also clear that this data differs a lot in and of itself. For example, emails are a popular example of unstructured data: Individual emails can vary widely in terms of length and kinds of information, and there is no real formatting that lends itself to classical database storage. On the other hand, transaction data are a good example of structured data: The information is organized into specified fields of known structure and length.
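The contrast can be seen in miniature below; the field names are invented examples, not a real bank’s schema:

```python
# Structured data: fixed, typed fields that map directly onto
# database columns of known structure and length.
transaction = {
    "account_id": "ACCT-1042",
    "amount": 125.50,
    "currency": "USD",
    "timestamp": "2022-03-14T09:26:00Z",
}

# Unstructured data: free-form text with no fixed fields or layout,
# which classical database storage cannot represent directly.
email_body = """Hi team,
Attached are the Q1 notes. Can someone confirm the client's new address?
Thanks, J."""
```

An archiving solution that handles only one of these two shapes covers only part of a financial institution’s compliance surface.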
The Problems that Come from Scattered Data
Who cares that there are so many kinds of data being tossed around? Compliance officers should, for one thing. Having different kinds of data in different places can be a complete nightmare when it comes to things like data privacy and compliance. For example:
  • What happens when transaction data is appropriately encrypted in a transaction database but fails to be encrypted when that data is aggregated for analytics purposes?
  • How can appropriate access be maintained? For example, how can institutions ensure that clients have access not only to their account information but also to things like customer service correspondence?
  • Which bits of data are covered by the company’s privacy policy, and which aren’t? Which are included in state and federal privacy laws, and which are not? How would someone even know?
  • What mechanisms are in place to ensure that all kinds of data in every location is destroyed once it reaches the end of its data lifecycle?
Again, keeping track of all that data is, from a compliance standpoint, exponentially more difficult with each kind of data and each new “data home.”
Moving Forward: A Singular Data Archiving Solution?
To be clear, the issue is not a lack of solutions. The idea of data archiving—that is, moving data from its more readily usable formats to a kind of “deep storage” for long-term preservation—has been around at least since the Library of Alexandria in the third century B.C. Today, there are literally dozens of data archiving and data storage alternatives on the market.
The real issue is that most of these tend to be one-trick ponies. For example, there are some great examples of email data archiving platforms, and they do really well with unstructured data. There are also document management systems, backup, archival software, monitoring systems, and more. But each one has its own specialty; few can act as a central repository for everything while still managing access, logging, and data destruction as needed for compliance.
Indeed, this patchwork landscape of solutions is precisely what drove our engineers to create the Omni Archive Manager. We saw that there was a need for a single tool that could archive all data, maintain appropriate records management across the data lifecycle, and monitor and control access. The need for such a tool happened to be greatest for financial institutions—precisely because of the amount and variety of data they were generating every day.
It might be the case that some institutions can get along with a “cobbled together approach.” But, with increasing regulatory legislation around data privacy, and increasing sophistication of cyber attacks, those days are numbered. No longer can finserv rest easy, assuming that siloed information will be their saving grace. Soon, even smaller companies will need to archive and monitor their data as if they were huge international companies. Then the question becomes: How quickly can they do it?

Need a Ransomware Protection Strategy? Immutable Storage Might Just Be the Key

Ransomware is a growing problem, and not just for larger organizations. As modern encryption methods become more sophisticated, so do ransomware scams, which encrypt vital data and hold it hostage unless a “ransom” is paid. Would-be ransomers are now targeting “mid-market” organizations, hoping that these have fewer resources to detect, repel, or recover from an attack.
But ransomware is, at its core, a scare tactic. As James Scott, Senior Fellow at the Institute for Critical Infrastructure Technology, puts it: “Ransomware is more about manipulating vulnerabilities in human psychology than the adversary’s technological sophistication.”
Which means that, far from being the cause of ransomware attacks, technology is the solution.
The Growth of the Ransomware Threat in 2022
That ransomware poses a sizable security threat to organizations is not news. The FBI’s Internet Crime Complaint Center (IC3), which provides the public with a trustworthy source for reporting information on cyber incidents, logged some 2,474 ransomware complaints in 2020. Those complaints reflect a whopping 225% increase in ransom demands, thought to total $16.1 million in losses in the U.S. The amount lost to ransomware worldwide is an order of magnitude greater than that.
In addition, the Cybersecurity & Infrastructure Security Agency (CISA) reported in February 2022 that it was aware of ransomware incidents within 14 out of 16 critical infrastructure sectors in the U.S.
But it’s not just big businesses and critical infrastructure that are being targeted. Ransomers are starting to target smaller organizations. Automation and increasingly sophisticated techniques are allowing criminals to scale their efforts to hit more of these smaller companies. And, without proper resources or protection, those smaller organizations are more likely to be vulnerable to such attacks—and to pay the outrageous ransoms.
Which raises the question: What can small and mid-sized organizations with fewer resources possibly do to mitigate or prevent the potential damage done by ransomware attacks?
Recoverability Renders Ransomware Useless
Naturally, the first line of defense against ransomware would be to prevent the infection and spread of malicious software to begin with. But that’s a tall order. Today’s modern organizations have multiple databases, tied in with multiple outside networks (vendor databases, for example). Add in the human element—someone falling for a phishing scheme, for instance—and it’s likely that most organizations have compromising ransomware somewhere in their digital ecosystem.
But if organizations can’t prevent ransomware from taking hold, they can render it useless. In fact, cybersecurity experts often recommend not paying the ransom. This keeps money out of the bad actors’ hands and lowers the chance they will do it again.
That makes recoverability the linchpin of any ransomware defense strategy. If an organization can recover its data and applications from a point in time before the ransomware infected the system, it can refuse the ransom while minimizing its losses.
Immutable Storage and WORM for Recoverability
Immutable storage (more specifically, immutable backups) is a key part of any organization’s data recoverability efforts. An immutable backup is a backup file that can’t be altered in any way. Most systems for immutable backup also have extensive logging capabilities for recording who accessed what bits of data, and when.
Immutable backups should be created using a WORM (write-once-read-many) designated database or data archive. In a WORM system, data cannot be changed, overwritten, or deleted—not even by the administrator. This means that a bit of ransomware cannot overwrite the data present in the database with an encrypted form.
The idea of WORM is not necessarily new—some forms of WORM have been around for decades. What is new is incorporating WORM data archives into a modern infrastructure dominated by cloud applications and APIs.
Considerations for Using WORM to Protect Against Ransomware
Of course, immutable storage is not a cure-all when it comes to ransomware. A 2021 article in TechTarget, for example, takes aim at the idea that immutable storage can be used alone, or that it is really a “last line of defense” against ransomware.
That overall point is well taken: immutable storage should be part of a larger, more holistic strategy to prevent and combat ransomware breaches. Still, when WORM databases are set up and maintained correctly, they do offer a solid defense against this costly kind of attack.
That setup and maintenance should include:
  • Evaluating storage systems for “backdoors” that could give would-be ransomers the ability to remove WORM designations or delete whole clusters serving backup functions.
  • Employing a suitable versioning system (for example, creating new versions of backups rather than appending to, or changing, previous versions).
  • Scheduling backups and maintaining versions at an interval that makes sense for business continuity.
  • Monitoring access logs to identify unauthorized users or suspect locations.
  • Employee education and training, so that ransomware does not take root to begin with.
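The versioning recommendation above can be illustrated with a short sketch (Python; the key-naming scheme is a hypothetical convention, not a prescribed format). Each backup run writes a new, timestamped object instead of appending to or replacing the previous one, so a compromised run can never destroy earlier versions:

```python
from datetime import datetime, timezone

def versioned_backup_key(prefix, when=None):
    """Build a unique, timestamped key so every backup run creates a
    new version instead of overwriting the previous one."""
    when = when or datetime.now(timezone.utc)
    return f"{prefix}/{when.strftime('%Y%m%dT%H%M%SZ')}"

def latest_version(keys, prefix):
    """Timestamped keys sort lexicographically, so the newest version
    is simply the maximum key under the prefix."""
    matching = [k for k in keys if k.startswith(prefix + "/")]
    return max(matching) if matching else None

keys = [
    versioned_backup_key("db-backups", datetime(2022, 1, 1, tzinfo=timezone.utc)),
    versioned_backup_key("db-backups", datetime(2022, 1, 2, tzinfo=timezone.utc)),
]
print(latest_version(keys, "db-backups"))
```

Combined with WORM storage, this means a recovery simply picks the newest version written before the infection date.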
Again, immutable storage cannot detect or discourage ransomware attacks. But, to use a medical metaphor, it can bolster the organization’s immune system to fight and recover from such attacks when they do happen.
Do you have further questions about ransomware and immutable storage? Or just need help setting up your own immutable storage solution? Reach out to us so we can discuss the possibilities.

Cybersecurity and Data Privacy: Integrated Teams Need Integrated Solutions

Each year, countries pass new data privacy laws, each with its own regulations and requirements. For companies that do business in America, the picture is even starker: four states already have data privacy laws on the books, 15 more have bills under active consideration, and another 14 have proposed at least one in the past. That’s 33 different legislative regimes… in just one country. While the rules in these laws vary, what remains the same is the need for strong cybersecurity to ensure that user data is protected.
The increasing demands in these areas can leave companies feeling like they’re slowly getting buried, unable to dig themselves out. Other companies may manage to keep up with new developments, but only by expending an exorbitant amount of resources. Either way, large and enterprise firms need a better way to keep their systems secure and compliant without breaking the bank or running their teams ragged.
Integrating Cybersecurity and Data Privacy Functions
One of the most significant developments in recent years in cybersecurity and data privacy is the growing realization that these two areas hold more in common than many may think at first glance. Cybersecurity often works from the outside in—preventing breaches using protocols that manage external access to a system. However, how a company organizes, secures, and encrypts data at rest can have a massive impact on the effectiveness of breach prevention and mitigation.
Data privacy, on the other hand, works from the inside out. You may have a ton of data, with different rights requirements. Some employees should be able to access all of it; others only need access to some of it; and there are likely some who shouldn’t see any of it. Complicating the matter is the variety of third parties that need access to your data: analytics firms, payroll vendors, and integrated technology partners. Data must be kept fluid and usable, but sensitive data should be masked or encrypted to prevent unauthorized access from within. Again, determining how your company organizes, secures, and encrypts data is a significant part of this team’s operations.
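As a concrete illustration of masking at query time, here is a minimal sketch in Python. The field names, roles, and masking rule are all hypothetical; the idea is simply that sensitive fields are redacted for callers whose role doesn’t grant access, while the rest of the record stays usable:

```python
SENSITIVE_FIELDS = {"ssn", "account_number"}

# Hypothetical role policy: which sensitive fields each role may see.
ROLE_CAN_SEE = {
    "compliance": SENSITIVE_FIELDS,   # full access
    "analytics": set(),               # no sensitive fields
}

def mask_value(value):
    """Show only the last four characters (a common redaction style)."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def mask_record(record, role):
    """Return a copy of the record with sensitive fields masked
    unless the caller's role grants access to them."""
    allowed = ROLE_CAN_SEE.get(role, set())
    return {
        field: (value if field not in SENSITIVE_FIELDS or field in allowed
                else mask_value(value))
        for field, value in record.items()
    }

row = {"name": "A. Customer", "ssn": "123-45-6789"}
print(mask_record(row, "analytics"))   # SSN redacted
print(mask_record(row, "compliance"))  # SSN visible
```

In production this logic usually lives in the database or data-access layer, so no application can bypass it.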
So, even though the day-to-day operations of these two areas may feel like they’re separate, they share many of the same underlying core tasks. In some companies, different teams run these operations, while in others, it may be different people on the same team—or possibly a fully integrated security team. But no matter how a business approaches cybersecurity and data privacy, how it sets data governance policies affects both efforts.
The Human Element
Unfortunately, correctly structuring your cybersecurity and data privacy operations isn’t enough to overcome all problems. There’s one weak link that threatens to bring down operations every day, and it’s one that often goes under the radar or gets misconstrued as a major strength: The human element.
Humans are a major cause of vulnerabilities. We tend to use weak passwords, even though we know better. We fall for phishing attacks and give our credentials away for free. Or, we fail to secure data, leaving it open for others (employees, consultants, vendors, or even hackers) to access.
But worse than mistakes that come from laziness or ignorance are those that we make by design, believing them to be good. As Forbes points out, we often make policy decisions that compound our difficulties instead of relieving them. When we add tools that provide new features for our workers, better security, or other benefits, we’re also adding maintenance, proactive security actions, and overall monitoring to our agenda.
Better Protection with Automated Solutions
The key to overcoming the human tendency to undermine security is controlled workflows. By ensuring that people work through each of the required steps for security and compliance, companies can limit the mistakes their employees make.
While companies’ strategic decisions matter when it comes to security and data privacy, a huge portion of the success or failure of these programs comes down to consistency of execution. Companies first learned this lesson with patching: more than 60 percent of data breaches have been traced to missing operating system or application patches. That fact helped spur the move to the cloud and to automated systems that ensure the right steps are taken every time, often without requiring worker input. The same is now true of data privacy and compliance workflows: automating them saves time, but also ensures that critical actions don’t fall through the cracks.
Whether streamlining regulatory compliance, books and records management, or database optimization, automation can help reduce the amount of time that your employees are spending on non-critical tasks and lower your long-term costs. Automated processes can also prompt employees to take actions that only they can complete, ensuring that regulatory and security actions are completed on time.
How Infobelt Helps
If you find yourself using a ton of different platforms to manage your data security, then adding one more isn’t always the right move. But, if that one platform replaces everything else you’re using, taking away the headache of integration and paying for service agreements across a wide range of tools, the savings in time and money can be massive.
At Infobelt, we’ve worked hard to create tools that help enterprises take control of compliance and data privacy while still ensuring the visibility needed to run a modern tech stack. Building the right data management strategy, managing access, ensuring a smooth compliance workflow, and developing a storage strategy can be huge challenges, but Infobelt makes it easy. And since new threats and new laws never stop, we don’t either.
Schedule a demo today to get matched with the right tools for compliance with financial, healthcare, or energy data privacy laws and simplify your data compliance processes.

Efficient Workflow and Automation for Compliance

There’s a tension between how company leaders and employees see compliance. For leaders, the question is “How?”—or, more to the point, “How quickly?” (As in, “How quickly can we put a compliance program in place?” “How quickly can we provide reporting for this audit?” and so on.)
But for employees, the question is too often, “Why bother?”
Take, for example, the case of a European bank reported by McKinsey. The firm’s early-warning system and handover procedures existed “on paper only,” with frontline employees either entirely ignorant of what was required of them or blatantly ignoring the policies—much to the shock of senior management.
Before learning about the “How” of compliance, company leaders need to look at the common sources of compliance failure. Once those are pinpointed, we get a better sense of what’s missing. As it turns out, investment in compliance activities need not focus on more leadership, more audits, or more binders filled with policies. When dealing with compliance failures, simple and efficient workflows are everything—which means investing in automation.
The Sources of Compliance Failures
Different authors will point to different sources of compliance failures. For example, one HBR article identifies poor metrics and a “checkbox mentality” as contributing to poor compliance programs. Another industry white paper puts “lack of leadership” and “failure to assess and understand risk” at the top of the list.
Whatever the structural reasons, most compliance failures come down to a single employee or team failing to take the necessary steps. The daily compliance workflow is where the rubber hits the road, and if that workflow is burdensome or complicated, it often doesn’t get done at all.
Take, for example, this more recent study by Gartner, which surveyed 755 employees whose roles included some measure of compliance activities. The reasons these employees gave for compliance failure paint a telling picture:
  • 32% said they couldn’t find the relevant information to complete compliance activities,
  • 20% didn’t recognize that information was even needed,
  • 19% simply forgot to carry out compliance steps,
  • 16% did not understand what was expected of them, and
  • 13% “just failed to execute the step.”
Creating rules or policies is one thing. Many enterprise-sized companies can boast binders full of company policies and procedures created for compliance purposes. But unless those obligations are properly integrated into employees’ daily workflow, there will always be steps that “fall through the cracks.”
On the other hand, having an efficient and automated compliance workflow makes compliance tasks less burdensome.
Compliance Workflows and Compliance Automation
Employees often have a rhythm or cadence to their day. As they interact with various teams, their contributions form a workflow through the organization. Compliance activities have a workflow, too. The problem is that work time is a limited resource, and so time and effort spent in one workflow naturally takes away from others.
Compliance workflows, then, are the specific, concrete steps needed to ensure the organization is aligned with both internal controls and external regulations. Which steps are required in a compliance workflow depends on the regulations and controls involved.
Take, for example, the steps necessary to comply with data privacy laws (such as the GDPR, CPRA, parts of HIPAA, and so on). These laws are often a balancing act between right-of-access provisions and provisions for ensuring that private information is kept secure. To keep in compliance with both, any handling of documents needs to include a compliance workflow, including steps like:
  • Sending or returning acknowledgements
  • Tagging documents with appropriate metadata
  • Storing documents securely
  • Responding to requests for documentation
  • Getting signatures/approvals from the right parties
  • Destroying records after their required retention interval expires
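One way to make such a workflow concrete is as an explicit checklist that can report which required steps are still outstanding for a given document. A minimal sketch in Python (the step names, drawn from the list above, are illustrative):

```python
# Illustrative required steps for one document, in workflow order.
REQUIRED_STEPS = [
    "acknowledge_receipt",
    "tag_metadata",
    "store_securely",
    "collect_approvals",
]

def outstanding_steps(completed):
    """Return the required steps not yet done, in workflow order.
    Anything returned here is a step about to 'fall through the cracks'."""
    done = set(completed)
    return [s for s in REQUIRED_STEPS if s not in done]

# A document where tagging and approvals were skipped:
print(outstanding_steps(["acknowledge_receipt", "store_securely"]))
# -> ['tag_metadata', 'collect_approvals']
```

The value of automating this is that the system, not the employee, remembers what remains to be done.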
Identifying the relevant workflows is just the first step, however. Even the most meticulously defined workflow will suffer from compliance failures if those steps have to be done manually for every document. This is where compliance automation comes in.
Compliance automation is the process of simplifying compliance procedures through the use of intelligent software. Taking the above example of document management, such software could automate compliance by:
  • Routing documents to different people and departments as needed
  • Sending receipt acknowledgment automatically as soon as documents are opened or accessed
  • Storing documents securely in a central repository
  • Tracking document access
  • Masking sensitive information when data sources are queried
  • Automating signature-gathering
  • Scheduling document destruction when retention periods end
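The last item, retention-based destruction, can be sketched as a simple scheduling rule. In this Python sketch the retention periods are illustrative only (real values come from the applicable regulation, not from this example):

```python
from datetime import date, timedelta

# Illustrative retention periods by record class -- actual periods
# must come from the applicable books-and-records rules.
RETENTION_DAYS = {
    "trade_confirmation": 6 * 365,
    "marketing_email": 3 * 365,
}

def destruction_date(record_class, created):
    """Earliest date a record may be destroyed under its retention rule."""
    return created + timedelta(days=RETENTION_DAYS[record_class])

def due_for_destruction(records, today):
    """Return records whose retention period has expired as of `today`."""
    return [r for r in records
            if destruction_date(r["class"], r["created"]) <= today]

records = [
    {"id": 1, "class": "marketing_email", "created": date(2018, 1, 1)},
    {"id": 2, "class": "trade_confirmation", "created": date(2018, 1, 1)},
]
print(due_for_destruction(records, date(2022, 1, 1)))  # only record 1 is due
```

An automated system runs a rule like this on a schedule and prompts (or performs) the destruction, rather than relying on someone remembering to do it.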
Document management is just one example of an area where compliance tasks are routinely forgotten or ignored, simply because the compliance workflow can be overwhelming. Automation centralizes workflow tasks and ensures that the right activities are prompted at the right time, throughout the document’s lifecycle. The same can be done, in theory, for things like marketing and sales collateral approval, certification processes, attestations, financial reporting, and more.
But Is the ROI There?
Cards on the table: Our own data-archiving platform, Omni Archive Manager, was designed to do exactly the above: automate data management capabilities, including compliance tasks, to reduce risk and satisfy legal and regulatory requirements. We created it because we saw how much financial services firms were losing, whether to manual processes that were too complex or to outright compliance failures.
What we saw out in the field has been borne out by research, too. For example, a study from the Ponemon Institute and Globalscape looked at the overall cost of compliance and non-compliance across several industries and found that:
  • Non-compliance is 2.71 times more costly for an organization than investing in compliance.
  • The largest costs associated with non-compliance had to do with business disruption and productivity loss. Both were many times more costly than associated fines and penalties for non-compliance.
  • Corporate IT bears the majority of compliance costs, a sign that infrastructure and automation play the leading roles in compliance activities.
We have seen this kind of ROI for our clients as well—though different organizations will see different results, of course.
The Takeaway
To really get serious about compliance, company leaders have to do more than ask “how” questions. They must take a hard look at what compliance looks like on the ground.
When one does that, it becomes clear that most compliance failures come down to workflow issues. Creating a better user experience through compliance automation can relieve many of those issues. Fewer compliance failures are what will truly fuel greater savings for the organization.
So many other areas of business have been profitably automated—why would compliance be any different?

What Will it Take for Financial Services Firms to Regain Customer Trust in 2022?

If there’s one industry where consumer trust leads to success, it’s in the financial services industry. But when researchers look at the data on customer trust with financial institutions, the picture gets muddy. It seems that customer trust is high, but eroding. A mix of demographic changes and industry disruption is, in effect, “shifting the customer trust landscape.”
Financial institutions are in a much different place when it comes to trust than they were after the 2008 financial crisis and ensuing recession. Regaining customer trust now, in 2022, is not an exercise in restoring faith in institutions. It requires convincing a new generation of consumers that individual brands are being “above board,” especially when it comes to things like data privacy and governance. Firms that can do that will realize a major competitive advantage.
How the Customer Trust Landscape is Shifting
First, there’s good news: U.S. households are more likely to trust financial institutions than other institutions, including government agencies and fintechs, especially when it comes to safeguarding their personal data. That was the finding of a survey of 1,300 heads of households conducted by the Bank for International Settlements, and the finding held consistent across age, ethnicity, and gender.
That said, overall trust in financial institutions is declining somewhat. A Morning Consult poll of over 4,400 U.S. adults found that, while 13% said they trust the financial industry more than they did a year ago, a full 17% reported that they trusted the industry less. While that swing in opinion is not as large as for other industries (social media companies and entertainment companies, for example), it still shows an emerging “trust gap” to which financial services firms need to pay attention.
That gap is set to grow rapidly. Younger generations especially report that they trust fintech companies more than any other financial companies, even large national banks. That trust might come, in part, from familiarity… but that cannot be the whole picture. For one thing, the higher level of trust in fintechs holds even for those in their 40s—a cohort that did not grow up with smartphones and mobile apps. Other factors, such as a perception of objectivity and well-defined processes, might also come into play.
Another major factor is consumer opinion about data privacy itself. Trust in a company starts with everyday interactions with customers, and data privacy is a key area where those customers will feel the impact directly.
A Focus on Trust Can Be a Major Competitive Advantage for Financial Services Firms
Indeed, consumers are not shy when it comes to what they want. Most of them simply want to have a clear idea of how their data is collected, tracked, used, and shared. And they want the option to “opt out” of certain activities (sharing contact information with a third party, for example).
In short, having strong data privacy management, with a clear communications component, will go a long way to establishing consumer trust across platforms and user experiences. Closing the trust gap here will be a huge competitive advantage for firms—but it will require a collaborative effort between customer service, risk, IT, and compliance.
But is that collaboration worth the cost? Even putting aside issues of company culture—making decisions around products and service lines rather than around customer concerns like privacy, for example—there is the issue of technology. Lacking tools for streamlined reporting and tracking can be a serious hurdle, while investing in the right RegTech makes a strong data privacy management program that much easier to build.
RegTech is the “Behind the Scenes” Basis for Building Trust
To put the point a little more bluntly: Nothing will betray customer trust more quickly than that moment when someone from the firm has to look the customer in the eye and say, “Your data may have been exposed.”
Or, when the firm cannot give a straight answer to the questions “So when is my data shared? And how is it used?”
The problem here is usually one of scale: the average financial services firm processes thousands of documents every month. Many of those are forms and requests that carry pieces of personal and financial data, often in a mix of structured and unstructured formats.
Ingesting and managing all of that data—and doing so in a compliant format—is a huge task. Tracking that data once in the system, and maintaining appropriate access control, can be even harder. This must include audit capabilities that can show who worked with what data, and when (with timestamp). And finally, there has to be a guarantee of full data destruction, so that any attempt to recover or restore data fails when that data truly should be absent from the system.
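An audit trail of the kind described above can be as simple as an append-only log of who touched which record, and when. Here is a minimal sketch in Python (the class and field names are hypothetical; a real implementation would write to immutable storage, not an in-memory list):

```python
import json
from datetime import datetime, timezone

class AuditLog:
    """Append-only access log: records who worked with which data, and when."""

    def __init__(self):
        self._entries = []

    def record_access(self, user, record_id, action):
        self._entries.append({
            "user": user,
            "record_id": record_id,
            "action": action,  # e.g. "read", "export", "share"
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def accesses_to(self, record_id):
        """Answer the customer's question: who accessed my data, and when?"""
        return [e for e in self._entries if e["record_id"] == record_id]

log = AuditLog()
log.record_access("analyst_7", "cust-1001", "read")
print(json.dumps(log.accesses_to("cust-1001"), indent=2))
```

With every access timestamped, the firm can give a straight answer to "when is my data shared, and how is it used?"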
This is why RegTech must be the basis for data privacy management as well as compliance. As much as some authors want to argue that customer trust is a “human issue,” or a “cultural” issue, trust comes down to this: Are firms making it easy for their employees to “do the right thing” when it comes to handling, archiving, retrieving, tracking, and sharing data?
Think about it: No one trusts Amazon to deliver to their door because Jeff Bezos seemed like a nice guy. They trust Amazon because the company has obviously made the investment in technology, logistics and infrastructure. How much more important will those investments be in the competitive financial services industry?
If the numbers we’re seeing are any indication, they will be very important. Financial institutions cannot gain trust just by talking a good game about privacy. 2022 will be the year they put their money where their mouth is.

The Role of RegTech in Banks’ Digital Transformation

Financial services firms have recognized digital transformation as a means to ensure compliance with regulators. Due to expanded offerings from RegTech firms, transforming how financial service firms manage their regulated data has never been easier. Financial services firms will continue to increase investments into RegTech firms that offer data management transformation, back-office automation, modern technologies, and a reimagined workforce to limit their future regulatory compliance exposure.
Working with third-party providers, such as Infobelt, can offer fast deployment. With every financial institution looking to drive operational efficiencies, it is more important than ever to understand how much data they have, how many hours it takes to manage it, and how regulators might use it.
Each organization will need to determine what works within the parameters of existing and desired business models. Traditional financial institutions will feel increased pressure to act quickly and decisively regardless of the selected path. Here are the most critical digital banking trends impacting the adoption of RegTech.
Keeping Up with Regulators’ Data Technology
Regulators continue to improve their capabilities with the use of AI and other data aggregation technologies, such as Supervisory Technology, also known as “SupTech.” SupTech is the use of technology by supervisory and regulatory agencies to improve efficiency in their oversight of industry. As the adoption of SupTech increases, so does a firm’s risk of having non-compliance exposed, a trend noted in a paper published this past December by the BIS.
Reinventing Data Management Processes
“The key for financial services firms to ensure compliance is to have all the necessary information that a regulator may require,” says the Chief Operations Officer at a leading RegTech company. “Sure, regulations are changing, but what changes more often is what regulators look at to ensure compliance.” Not only is it essential for banks to know what data should be retained, but it’s also imperative that they know precisely where it is. Relying on outdated data management processes will only continue to hurt financial services firms. By reworking their current data architecture, firms can automate processes to capture regulated information in one place where it can be indexed and retained for any required amount of time.
A Changing Workforce Increases Data Capture Needs
Digital transformation will be crucial to the success of remote work. Firms that can adapt how their regulatory information is captured will be better prepared for information requests from regulators. Gartner reports that between 2016 and 2019, nearly 75% of required job skills changed by more than 40%, and the use of new technologies has driven the need for new soft skills. Gartner also notes that financial institutions need employees who can collaborate, innovate, adapt, and persevere through business disruption, not to mention that close to 75% of financial services firms’ employees will continue to work from home in 2022.
Consumer Insight Will Be The Differentiator
The fuel that powers digital banking transformation is data and analytics. In 2022, customers will increasingly expect their financial institutions to know, understand, and reward them in real-time. Using internal resources and partnering with firms like Infobelt, financial institutions can replicate the intelligent experiences their customers have become accustomed to with Amazon, Google, Netflix, and others.
In conclusion, financial services firms must embrace transformation and keep pace with, or get ahead of, technological change and data management to remain compliant and competitive. Internal modernization provides broad advantages and responds to marketplace needs. To streamline back-office operations and make more informed business decisions, financial services firms can follow the lead of FinTech and big tech organizations. As data is centralized across the organization, it can be analyzed to create a better experience for the customer base.

RegTech Industry Focal Points for 2022

“Digitalization is not confined to the banking industry, of course. But it has already left a strong imprint on banks, and all signs point to even more sweeping changes ahead.”

Andrea Enria, chair of the Supervisory Board of the European Central Bank, September 2021

The RegTech industry had a banner year in 2021, as the COVID-19 pandemic continued to fuel the already existing trend of digitization of everything in our daily lives. Financial institutions (FIs) have followed suit and accelerated their pace to meet consumer experience expectations and regulatory demands involved in digital transformation to compete and survive in this increasingly unpredictable post-pandemic world.
For the upcoming year, 2022, the RegTech sector anticipates the following themes to become focal points:
The War for Talent During the “Great Resignation”
Forrester predicts that in 2022 “Banks will double down on innovation” by spending lavishly on technology, talent, and fintech. Banks will face stiff competition from tech companies and other sectors for top engineering talent and technology workers.
Deloitte also reports seeing the need for technology talent. Individuals with process and technology backgrounds, be it in investment management, banking, capital markets, insurance, or real estate, will be highly sought after. “In fact, in every sector, we’re seeing the opportunity for this relationship between focusing on transformation and technology and the need to have talent that can help with marrying that technology to a much more seamless process to help improve the experience of the customer overall,” says Deloitte’s Monica O’Reilly.
Already, JP Morgan has ramped up recruitment for its new UK digital bank and plans to take staff numbers above 1,000 in 2022. Additionally, opportunities for new roles within the global anti-money laundering market will arise as the market expands at a compound annual growth rate (CAGR) of 15.6% from 2021 to 2028, reaching USD 3.19 billion by 2028, according to a new report by Grand View Research, Inc.
People have been resigning from their jobs in droves during the pandemic, revealing a rarely seen job market that favors the candidate. As the war for talent rages on, an influx of new regulations, increased incidents of fraud, evolving customer demands, and the move toward digital transformation are ushering in new, more specialized roles. Finding the right talent may be a pain point for FIs in 2022.
Financial Institutions will Fully Embrace the Cloud
In 2022, financial institutions look to accelerate moving more data and compliance processes to the Cloud. By the end of the year, on-premises deployments will increasingly be considered archaic. This change of opinion emerged from the COVID-19 pandemic, as the rise in hybrid and work-from-home (WFH) arrangements demonstrated to FIs how valuable the flexibility of the Cloud is.
The challenge of ensuring compliance within a distributed workforce is better suited for cloud-based RegTech solutions. These solutions allow financial institutions to scale their financial workflows to meet hybrid or WFH requirements and reduce compliance costs by eliminating the need for on-site computing and storage.
Operational Resilience Continues to be of Key Importance
Regulators have long been concerned about operational resilience, predating the pandemic; what the pandemic has done is accelerate its development and implementation. Operational resilience refers to the idea that digital solutions, which may run critical business functions, must withstand any disruption. Risk and compliance applications must also accommodate shifts to alternative working patterns such as remote and flexible arrangements.
Thomson Reuters’ recent report defines it as follows: “There appears to be a consensus that operational resilience entails the following main steps: identify, prepare, respond and adapt, recover and learn. Versions of operational resilience definition can be applied in all jurisdictions as the basis upon which to develop approaches to operational resilience.”
This past March, the UK’s Financial Conduct Authority (FCA) and Prudential Regulation Authority (PRA) published their final policy and supervisory statements on operational resilience. It stated that by March 31, 2022, firms must have identified any critical business services, determined maximum tolerable disruption, and undertaken the necessary mapping and testing. As soon as possible after March 31, 2022 – and no later than March 31, 2025 – organizations must have performed mapping and testing to remain within impact tolerances for the services.
In general, financial institutions that struggled with manual or legacy solutions throughout 2020 or 2021 should expect regulators to become more demanding in 2022.
Prioritizing Holistic Compliance Will Ease the Burden of Regulatory Change
Holistic compliance has been one of the most vital points of emphasis in the RegTech world in 2021. “In the last year, there has been much talk about the value of holistic compliance in managing existing and future obligations. 2022 will be the year where this will turn into action and firms start to invest in holistic or integrated data capabilities,” says Matt Smith, CEO of SteelEye.
FIs are now changing how they approach regulatory change. In the past, financial firms addressed compliance inefficiently by tackling different regulations and implementation deadlines as separate projects, creating separate teams, solutions, and data sets for each obligation. The challenges of these ad hoc, project-based approaches have clarified the long-term benefits of working with consolidated data, platforms, and processes.
Holistic, all-in-one solutions have recently gained popularity as many more companies were challenged to remain compliant in the face of pandemic-related restrictions.
“The general concept of a holistic solution is to offer a range of compliance procedures in a single package, as opposed to going with individual vendors, each of whom offers a best-in-class solution for a particular compliance requirement. Arguments against this approach hinge on the idea of dead weight, of paying for unnecessary features, and even on the assumption that holistic vendors might boast an impressive range of features and solutions without providing the specificity and reliability of best-in-class vendors,” stated Ben Parker, CEO and founder of flow Global.
One possible outcome is for FIs to seek heightened coordination among vendors. Strong vendor partnerships could offer some of the benefits of a holistic approach while minimizing the downsides. However, resource constraints among regulated companies may, in some cases, limit their immediate ability to implement more optimal, holistic solutions. Likewise, most vendors lack the resources to build a solution that is both genuinely compliant and genuinely holistic on their own.
“We are already beginning to see an increase in integrated partnerships, white-label partnerships, and other such agreements between RegTech vendors, and we believe this trend will continue in 2022,” concludes Parker.
2022 Key Points: The War for Talent, the Cloud, Operational Resilience, and Holistic Compliance
The concepts outlined above are not entirely new topics for the RegTech sector. As we welcome a new year with the COVID-19 pandemic still part of our daily lives, it is not easy to know what to expect next in this ever-changing environment. Cloud migration has been progressing for years, as have the concepts of holistic compliance and enhanced operational resilience. What has changed is the aggressively accelerated pace of digital transformation within compliance.
Operating within the confines of regulatory expectations is vital to ensuring stable and secure financial markets. The regulatory landscape will continue to be unpredictable and challenging. If financial institutions are to thrive in 2022 and beyond, they need to think strategically and gather the right digital tools to assist them in their continued compliance journey.


Rijil Kannoth

Head of India Operations

Rijil is responsible for overseeing the day-to-day operations of Infobelt India Pvt. Ltd. He has been integral in growing Infobelt’s development and QA teams, and brings a unique set of skills to Infobelt with his keen understanding of IT development and his process-improvement expertise.

Kevin Davis

Founder and Chief Delivery Officer

Kevin is a co-founder of Infobelt and leads our technology implementations. He has in-depth knowledge of regulatory compliance, servers, storage, and networks. Kevin has an extensive background in compliance solutions and risk management and is well versed in avoiding technical pitfalls for large enterprises.