Before any big data project is kicked off, just like any other project, certain prerequisites must be met for it to be successful:
As Hadoop is a new technology, its adoption by financial organizations is not easy and faces various obstacles, such as:
Start with small "low-hanging fruit" projects: process data that is already structured, saves cost, and whose benefits are easy to quantify.
Although many of us assume that successful Hadoop implementations depend solely on IT or the Technology department, that is not true. Any Hadoop implementation requires changes that cut across the organization's business culture, operating model, and data architecture, as explained here:
Most successful implementations in financial organizations are carried out in three steps, discussed in the next three subsections.
Because of the hype around this technology, there may already be a good level of interest within a financial organization, but that interest needs to be continuously fuelled by providing practical experience and sharing success stories.
Organizations can provide a Hadoop play area for developers and analysts to load and analyze real or test data. The Hadoop platform may be built either from unused servers or through a strategic purchase of Hadoop servers, with distributions for development, testing, and production.
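Analyses in such a play area often begin with the classic MapReduce word-count pattern. The following is a minimal local sketch of that pattern in plain Python; the sample data and function names are illustrative assumptions, and a real job would run over data loaded into HDFS rather than an in-memory list.

```python
# A local sketch of the kind of exploratory analysis a developer might
# prototype in a Hadoop play area before scaling it out. The map and reduce
# functions mirror the MapReduce word-count pattern; the sample data below
# is purely illustrative.
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) pairs, as a Hadoop mapper would."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Sum the counts per word, as a Hadoop reducer would."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Stand-in for records an analyst would load into HDFS.
sample = [
    "trade settled trade pending",
    "trade settled",
]
word_counts = reduce_phase(map_phase(sample))
print(word_counts)  # -> {'trade': 3, 'settled': 2, 'pending': 1}
```

Prototyping the logic locally like this lets analysts validate an idea cheaply before committing cluster time, which supports the "low-hanging fruit" approach described earlier.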
What should developers and analysts do?
Most importantly, developers and analysts should share their results with business users and the wider community within the financial organization. This is an important step toward the next stage: running a project with real data and business benefits.
Once the business benefits are documented, the data and technical architecture designed, and the team already skilled in Hadoop tools, it is time to run a small project, or a proof of concept for a larger project.
What should developers and analysts do?
Once the project has been put into production, the minimum expectation is that:
What should developers and analysts do?