Big data is here to stay. In case there was any doubt, the Equifax fiasco painfully exposed how much data can rule—and ruin—the day. The most recent issue of The Family Office Alpha Report addressed big data’s transformation of the investment world, especially as the fuel for quantitative/algorithmic strategies.
But while data has provided an edge and various efficiencies for managers, wealth advisors, and regulators alike, the dark side of large data sets has also become evident. The Equifax hack exposed the personal and financial information of nearly half the US population; other recent breaches, including the theft of Whole Foods and Sonic customers’ credit card data and the WannaCry ransomware attack, underscore an ironic truth: as we hand over more information and data to “protect” ourselves and our assets, we can actually increase the risk of compromising our privacy. And beyond big data, what about the pitfalls of bad data?
Big data and its associated benefits are optimized when there is more data to access. The greater the quantity of data, however, the harder it becomes to keep that information private and secure. Big data is also only as good as its quality: unless the underlying data is clean and accurate, the insights built on it are flawed. A driverless car, for example, is only as good as the data it consumes. Given today’s trend of “fake news” and the fake Facebook and Twitter accounts created by Russian entities to influence elections, the question of whether big data systems can distinguish real data from flawed data remains paramount. Could an artificial intelligence asset management system, accessing big data in its self-learning process, be thrown off by fake news or by intentionally bad data?
Asset managers and investors alike walk a tightrope between revealing enough information to keep business chugging along and protecting privacy. Transparency rules the roost when it comes to the manager due diligence process: investors expect unfettered access to all financial data, background information, track records, due diligence questionnaires and details on any missteps from asset managers—on a regular basis. The scrutiny is magnified when investors request separately managed accounts and side letters. This diligence is to protect the interests of the investor.
This can be onerous for an asset manager, especially a small or emerging firm with limited staff, and it also presents confidentiality challenges. Most strategies rely on proprietary fundamental or quantitative processes and data that are the core of their “special sauce.” Revealing those cards, in the name of investor diligence and protection, can put asset managers at risk or even out of business.
Investors themselves are not off-limits when it comes to diligence and sensitive data. Accredited investor regulations, which purportedly protect investors from fraud and other liabilities, assume those same investors will permit enough transparency to satisfy the rules of SROs like FINRA. These SROs put the onus on broker-dealers and advisors to certify that investors meet minimum standards of net worth and other financial criteria, information that is notoriously opaque and difficult to verify. Again, the protection of the “consumer,” or investor, depends on the release of that very individual’s or entity’s sensitive data.
Anti-money laundering (AML) programs under SEC rules and the USA PATRIOT Act were instituted to strengthen the detection, prevention, and prosecution of international money laundering and the financing of terrorism. Yet ironically, as part of these safeguards, regulators regularly provide financial firms and others with sensitive data of suspected persons and businesses, including addresses, social security numbers, and birthdates, effectively deputizing the financial industry to help combat fraud. On the flip side, privacy protections and device encryption shielded the perpetrators of the 2015 San Bernardino mass shooting, complicating even the FBI’s efforts to access their cell phones.
Bernie Madoff’s Ponzi scheme was a massive data failure. The SEC’s multiple offices, which use electronic codes to identify the firms they are tasked with tracking, failed to aggregate their data, essentially meaning the left hand didn’t know what the right hand was doing, for 16 years. In 2016, the SEC itself was hacked, and information in its EDGAR filing system, which processes more than 1.7 million electronic filings per year, was compromised. If the very systems fueled by big data are turning an asset into a liability, where does that leave us? How can managers and investors feel comfortable when the SEC, the protector and regulator of financial data, is challenged itself? Trillions of dollars could be at stake.
We now have created a language we never would have dreamed of decades ago: “cybersecurity forensics,” “ransomware,” “cryptocurrency.” With this new paradigm comes fresh demands that go beyond faith in cybersecurity, diligence and audits. Big data is not going away; it’s only growing more complex as it becomes an increasingly critical cog in the wheels of business. As data evolves, so too will the need for more stringent security measures (security tokens that continuously flash new ID numbers, facial/eye/voice/fingerprint recognition), which are being developed at perhaps too slow a pace.
The big data way of life is fraught with conflicting agendas and ironies, and in many cases advances in data collection and utilization can impede and even imperil business processes. Investors will need to remain vigilant in their own diligence efforts, ensuring they have access to accurate data. At the same time, they must safeguard what has been revealed and entrusted to them in the diligence process, respecting what may never be disclosable. The perils of full disclosure are real, but in the end, quality must trump quantity.