Big data is the most valuable resource for fintech firms, says Manish Gurnani, Chief Technical Officer of Ksolves India Limited, at the Mumbai Micro Experience of Fintech Festival India 2021-22. In his session, he addressed how to handle big data workloads efficiently and effectively, highlighting the 5 Vs of big data, how efficiency differs from effectiveness, and how to make big data workloads score well on both counts. Turning to big data capabilities, he also discussed the theory of Carsten Lund Pedersen and Thomas Ritter, a big data success framework that helps you decide whether your big data project is likely to succeed.
What is the Carsten Lund Pedersen and Thomas Ritter theory all about?
Most executives focus on big data initiatives that centre on utilising data for business improvement and development. However, up to 85% of big data initiatives fail, frequently because executives do not appropriately identify project risks at the outset. Two eminent professors, Carsten Lund Pedersen and Thomas Ritter, contend that the success of data initiatives is defined mainly by four critical components: data, autonomy, technology, and accountability, known as the four D.A.T.A. questions. These questions stem from a four-year study of the marketing of big data. According to them, the factors required for big data success can be classified along two dimensions. The first is the emphasis of the activity, such as project conception or implementation; the second is the focus of the transformation, such as technological improvement or gaining people's support. These two dimensions combine to form the D.A.T.A. matrix.
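Read together with the descriptions that follow, the matrix can be sketched roughly as below (an arrangement inferred from how this article characterises each question, not a reproduction of the authors' original figure):

- Project conception with a technology focus: Data
- Project conception with a people focus: Autonomy
- Implementation with a technology focus: Technology
- Implementation with a people focus: Accountability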
The D.A.T.A. Success Framework That Big Giants Rely On!
The four D.A.T.A. questions formulated by Carsten Lund Pedersen and Thomas Ritter are as follows:
Data – “Can we access data that is valuable and rare?”
Access to high-quality data is, without a doubt, a must for every data-driven business endeavour. Good data quality management contributes to reduced risks and costs, enhanced efficiency and productivity, more informed decision-making, improved audience targeting, more effective promotional campaigns, better customer interactions, and a sheer competitive edge, because it helps extract more value from data sets.
Bad data can have a substantial impact on a company's bottom line. It can distort operational visibility, making regulatory compliance difficult; waste time and labour on manually correcting faulty data; present a fragmented picture of data, making it harder to uncover significant consumer prospects; harm brand reputation; and even endanger public safety.
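To ground this, here is a minimal sketch of automated data quality checks in Python with pandas; the dataset, column names, and rules are illustrative assumptions, not taken from the talk:

```python
import pandas as pd

# Illustrative customer dataset; the columns are assumptions for this sketch.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@y.com", "not-an-email"],
    "balance": [100.0, -5.0, 250.0, 80.0],
})

# Completeness: share of non-missing values per column.
completeness = df.notna().mean()

# Uniqueness: duplicate identifiers distort operational visibility.
duplicate_ids = df["customer_id"].duplicated().sum()

# Validity: simple format and range rules.
invalid_emails = (~df["email"].str.contains("@", na=True)).sum()
negative_balances = (df["balance"] < 0).sum()

print(completeness)
print(f"duplicate ids: {duplicate_ids}, invalid emails: {invalid_emails}, "
      f"negative balances: {negative_balances}")
```

In practice such checks would run continuously against incoming data rather than once against a static frame, but the principle of codifying quality rules is the same.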
Autonomy – “Can employees use data to create solutions on their own?”
Decentralised decision-making is required to deliver value in the shortest possible time frame. Employees are given the autonomy to come up with ideas for data-enabled solutions that they can implement on their own. This lowers delays, increases the flow and throughput of product development, and allows for faster feedback and more inventive solutions.
Autonomy is an important stage in the brainstorming process, and it also addresses the human side of digital transformations. Google illustrates this well: it has a long history of supporting decentralised decision-making and allocating dedicated resources to initiatives across the company. Autonomy is vital not just for developing and launching new projects, but also for learning from and modifying current ones. Employees must therefore be able to use data to initiate, build, and adjust their own solutions.
Technology – “Can our technology deliver the solution?”
We live in a world that is continually evolving, changing, and improving. As companies grow and expand, they must keep reinforcing and building a solid technological foundation across the business to enable the changes that deliver a great experience for their clients and, just as importantly, for their staff.
Technology is one of the prerequisites for data success. It is a critical first step in the implementation phase as well as a component of the digital backbone. Organisations that had a solid technical foundation and a plan for growing it quickly have proven far more effective than those that had to scramble for solutions to fulfil the demands of their business. In the end, those firms spent far more money and missed out on many opportunities.
Accountability – “Is our solution compliant with laws and ethics?”
The accountability principle requires you to take responsibility for what you do with personal data and for how you comply with the other principles. Accountability is a crucial aspect of the human component of digital transformations and a vital stage in the implementation phase. The question management should ask in this regard is: "Does our solution comply with the law and ethical standards?" If your solution violates the law, it will have little value.
Furthermore, if people see the solution as "disturbing", you may face criticism in the media. Neglecting accountability can have serious consequences for everyone involved. As a result, no solution can be expected to succeed in the long run unless it complies with the law, social ethics, and norms. Under EU data protection legislation (the GDPR), there are seven core data protection principles: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability. These principles sit at the heart of the legislation, and while they do not provide hard and fast rules, they express the essence of the regulatory framework. Adherence to them is therefore a necessary component of any solid data protection approach.
Score Mapping For Your Big Data Project
You can anticipate whether your next big data project will be successful by tallying your answers to the four D.A.T.A. questions. One point is awarded for each "yes". A score of 4 indicates that you are on track to complete your big data project successfully. A score of 3 indicates that the project still requires significant effort in order to succeed. If you score zero or one, the project should be discontinued immediately, as it is unlikely to succeed.
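To make the tallying concrete, here is a minimal sketch in Python; the function name and dictionary keys are illustrative, and the handling of a score of 2, which the description above does not specify, is an assumption:

```python
def data_score(answers):
    """Tally "yes" answers to the four D.A.T.A. questions.

    `answers` maps each question -- "data", "autonomy", "technology",
    "accountability" -- to True (yes) or False (no).
    """
    questions = ("data", "autonomy", "technology", "accountability")
    score = sum(1 for q in questions if answers.get(q, False))

    if score == 4:
        verdict = "On track to complete the project successfully."
    elif score >= 2:  # A score of 2 is unspecified above; treated like 3 here.
        verdict = "Significant effort still required for the project to succeed."
    else:  # 0 or 1
        verdict = "Discontinue immediately: the project is unlikely to succeed."
    return score, verdict

# Example: strong data and technology, but autonomy and compliance still open.
score, verdict = data_score({"data": True, "autonomy": False,
                             "technology": True, "accountability": False})
print(score, verdict)  # -> 2 Significant effort still required ...
```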
Big data projects in particular might benefit from a more data-driven approach to predicting success. While big data holds immense commercial promise, the results so far have often been underwhelming. Part of the problem is probably that corporate leaders do not know how to predict the success of data efforts, since there are few systematic tools for doing so. The D.A.T.A. framework described above, which Manish Gurnani, CTO of Ksolves (a leading software development firm), also discussed in his tech talk at the Fintech Festival, can help executives make data-informed choices about big data initiatives.
AUTHOR
Anil Kushwaha, Technology Head at Ksolves, is an expert in Big Data and AI/ML. With over 11 years at Ksolves, he has been pivotal in driving innovative, high-volume data solutions with technologies like Nifi, Cassandra, Spark, Hadoop, etc. Passionate about advancing tech, he ensures smooth data warehousing for client success through tailored, cutting-edge strategies.