Crowd Control
Whether it is offshoring, outsourcing or automation, the financial industry is always on the lookout for new ways of getting more done for less. On March 27, Inside Reference Data hosted a webcast about a new operational model called crowd computing, which promises such efficiency.
During the webinar, which was sponsored by WorkFusion, Adam Devine, the company's New York-based vice president, product marketing and strategic partnerships, explained how crowd computing platforms, such as the one his company provides, allow users to bring together a hybrid workforce of subject matter experts, freelance workers and automation tools to work on particular projects. The platform learns from the output of the human workers so that over time, a greater proportion of work can be automated and headcount can be reduced.
Success Story
Crowd computing is in use today in a number of industries, including healthcare, the media and financial services. Among data providers, Thomson Reuters was an early adopter. Peter Marney, now New York-based vice president, global content management at publishing company John Wiley & Sons, explained why Thomson Reuters adopted WorkFusion's platform during his time there as senior vice president, platform and information strategy, a role he held until June, when he joined Wiley. Marney, who is an industry advisor to WorkFusion, said he had been responsible for a database of corporate entities that was growing so fast it was difficult to keep existing records up to date and add new ones. He said the ability to combine a crowd of outsourced workers with automation tools and subject matter experts met Thomson Reuters' needs exactly.
Hubert Holmes, New York-based managing director, reference data, at Interactive Data, said his company is not using crowd computing at the moment, but has explored how it could help to keep down the cost of collecting more data.
"I find crowd computing to be potentially a very cost-effective methodology of getting at lots of public data," said Holmes. "As we expand our content sets and do linkages across legal entity and security data, and really enrich our data more, that entails bringing in new data sets. To the degree that has been cumbersome—you have to add people and systems—and costly, we haven't done it as much as perhaps we would have liked. If we can get the cost of data collection down and scale it—turn it on and off—or if you have projects to create certain data sets, it would give us that impetus to do it."
Sweet Spot
Keith Broadhead, New York-based head of solution sales, Americas, at SIX Financial Information, was new to the concept of crowd computing, but said he believed it could help data vendors manage the challenge of analyzing, normalizing and categorizing big data.
A poll of listeners found 45% of the audience believed market data products would benefit the most from crowd computing. Broadhead was unsurprised to see market data as the top choice, but said he expected evaluated pricing to be more popular as it is such a hot topic at the moment.
Crowd computing can be used to harness the work of thousands of freelancers, as well as permanent members of staff. With so many people contributing to the management of data via a crowd computing platform, one listener questioned how the security of the data could be guaranteed.
Max Yankelevich, WorkFusion's chief architect, CEO and co-founder, explained that security is ensured because each data management job is broken down into so many separate "micro tasks" that no single worker has access to any sensitive information. "Once you break down the process into micro tasks, you are effectively putting this process through a shredder where each crowd worker sees only a very, very small part," he said. Yankelevich explained that this parsing of work into many tasks means crowd computing has, for example, been used to analyze tax returns without any threat to the security of personal data.
While breaking down a job into a series of micro tasks can take some getting used to, it is essential to the success of a crowd computing project, said Marney. He explained that Thomson Reuters had considered filling an address field to be a single task, but for the purposes of its crowd computing project, it was ultimately broken down into 25 micro tasks, including looking up the company website, finding the address page, extracting the address and parsing it.
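For readers curious how such a decomposition might look in practice, the sketch below outlines one possible way to model it. It is purely illustrative: the MicroTask structure, the task names and the run_pipeline routing are assumptions made for this example, showing only a handful of the 25 steps Marney described, and they do not represent WorkFusion's actual platform or API.

```python
# Illustrative sketch only: a hypothetical decomposition of an "update company
# address" job into micro tasks, in the spirit described by Marney. The class,
# task names and routing logic are invented for illustration and do not
# reflect WorkFusion's real platform.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class MicroTask:
    name: str             # e.g. "find_company_website"
    payload: Dict         # only the minimal fields this worker needs to see
    result: Dict = field(default_factory=dict)

def decompose_address_update(company_name: str) -> List[MicroTask]:
    """Break one 'fill address field' job into small, independent steps.

    Each step receives only the data it needs, so no single crowd worker
    ever sees the complete record (the "shredder" effect Yankelevich cites).
    """
    return [
        MicroTask("find_company_website",    {"company": company_name}),
        MicroTask("locate_address_page",     {}),  # receives step 1's output
        MicroTask("extract_raw_address",     {}),  # receives step 2's output
        MicroTask("parse_address_fields",    {}),  # street, city, postcode
        MicroTask("validate_against_source", {}),
    ]

def run_pipeline(tasks: List[MicroTask],
                 workers: Dict[str, Callable[[Dict], Dict]]) -> Dict:
    """Route each micro task to a worker (human or automated) and chain outputs.

    As the platform learns, entries in `workers` can be swapped from human
    crowd workers to automated functions without changing the pipeline.
    """
    previous_output: Dict = {}
    for task in tasks:
        # Pass forward only the previous step's output, not the full record.
        task.payload.update(previous_output)
        task.result = workers[task.name](task.payload)
        previous_output = task.result
    return previous_output
```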
The lesson? To get macro benefits from crowd computing, focus on the micro details.