CHAPTER 3

A Short History of Financial Services and How Technology Has Helped and Hindered

This chapter builds on the first chapter by providing a chronological history of how the industry has evolved over the past 50 years, albeit with a heavy focus on how technology has helped (and, in some cases, hindered) this development. (This chapter is not meant to provide a detailed chronological history with in-depth discussion, but to provide sufficient context for this book by explaining how the industry has evolved, with a particular nod to technology.)

This development can be summarized in the following four stages, which overlap to a certain extent.

Figure 3.1 The four stages of the financial services technology evolution

Each of these stages is discussed in more detail below, but it is interesting to note that the development of the financial services industry and of technology is very much a two-way process. As technology has grown, it has allowed the financial services industry to expand. For example, using web and mobile channels to support self-service gives customers more functionality and allows firms to serve more clients without requiring large call centers and manual processes. Equally, financial services has always been a keen user of technology, and this use has helped drive the development of technology as a whole. For example, some of the earliest users of relational databases and of certain aspects of Artificial Intelligence were Financial Services firms.

Stage 1—Prehistoric up Until the End of the 1970s

Originally, financial services was a very manual industry, with records maintained by hand in paper ledgers. This book-keeping was laborious, slow, clumsy, and prone to human error. Consequently, people did not use many financial services products. Some people may have had a bank account, but they held little in terms of savings or insurance products, and transactions tended to be performed using cash. Likewise, because of the cumbersome manual nature of the industry, firms could only offer simple products with minimal functionality.

While there was some technology in place, it was very primitive and not widely used:

1. In 1844, the telegraph was first demonstrated in the United States, and lines were subsequently rolled out across the country. This allowed simple messages to be sent from one side of the country to the other. However, its use was limited: it needed specific hardware and trained operators at both ends, and it was prone to error.

2. In 1865, the Pantelegraph was rolled out in France. This was effectively the first facsimile machine. Again, its use was limited: it needed specific hardware and operators at both ends, and it was prone to error.

3. In 1866, the first trans-Atlantic cable was laid between the UK and the United States. This was a major step forward because it allowed messages to be sent within hours, as opposed to the days or weeks needed to carry them physically by ship.

4. Finally, in 1918, the Fedwire Funds Service was implemented by the Federal Reserve Banks. This allowed money to be transferred between the 12 connected Reserve Banks using a Morse code telegraph system.

However, during the 1950s and 1960s, technology grew substantially.

Previously, with analogue technology, all computers were physically enormous, slow to operate, and unreliable. However, the advent of digital technology allowed several step changes to be made. Firstly, it allowed more powerful hardware (such as mainframe computers and mini-computers) to be developed which, apart from being physically smaller and more reliable, provided a massive increase in processing power. This, in turn, allowed more advanced and easier-to-use programming languages to be developed, as opposed to complex and hard-to-program languages such as assembler or machine code. Many of these languages were designed with a specific purpose in mind: for example, COBOL for business and FORTRAN for scientific calculations. Secondly, inter-computer communication was improved by the roll-out of some (albeit limited) communication standards.

These developments triggered many changes for Financial Services firms.

1. Firstly, several internal improvements were made. Book-keeping records were automated, thus removing the manual overheads and operational risks of maintaining manual paper ledgers.

2. Secondly, it allowed several new products to be offered to customers.

In 1958, the Bank of America launched its BankAmericard in California, United States. This would become recognized as the first successful credit card.

In 1960, an electronic pricing data firm based in Los Angeles launched its Quotron system. This was the first electronic system to provide stock market quotations to an individual through desktop (albeit bulky) terminals. It also printed up-to-the-minute prices on ticker tape.

In the 1960s, the first Automated Teller Machines (ATMs) were installed. These allowed customers to obtain balances and withdraw money from a “hole in the wall” without the need to visit a branch. The first withdrawal was made in Enfield, London, in the UK in 1967. One could argue that this was the first customer self-service offering within the financial services industry.

While these developments sound very primitive compared to the technology available now, they were massive breakthroughs at the time. However, these services were costly, which limited their availability to the general public. Also (and this is a common theme for all new technology), there was a certain social reluctance to use the technology because the general public did not trust it. It took several years for the general public to become comfortable using these services.

Stage 2—Let’s Automate Those Clunky Processes—1971 to 1995

This period saw a massive step increase in the power of technology, although most of the benefits went to the inner workings of firms, with limited benefits to customers.

This growth has often been linked to Moore’s Law. This law (stated by Gordon Moore, one of the founders of the chip maker Intel) observed that the number of transistors on an integrated circuit doubles approximately every two years, which means that processing power roughly doubles every two years as well. This rule underpinned a large number of step developments (a short worked sketch of this doubling follows the list below), namely:

1. The processing power of mainframes and minis increased dramatically, which allowed more complex operations and functions to be supported. Minis became more popular because they were easier and cheaper to operate than mainframes.

2. The personal computer (PC) arrived in the early 1980s. IBM created the defining PC, but it was soon superseded by a flood of cheaper and often more powerful clones. All of these ran MS-DOS, which meant applications written for one manufacturer’s PC could run across all PC clones. This allowed an explosion of powerful and easy-to-use packages covering spreadsheets, data analysis, and word processing. The growth of PCs started to reduce the demand for mainframes and mini-computers.

3. There were further advances in programming languages on mainframes, minis, and PCs. These languages were easier to use and provided much better functionality.

4. Database technologies also improved and became easier to implement and use. Previously, databases were collections of clumsy indexed sequential files which were hard to use, maintain, and integrate with. However, Relational Database Management Systems had now become mainstream because they were easy to use, easy to understand, and easy to maintain.

5. The concept of the graphical user interface (GUI) was developed, which allowed the user to interact intuitively with a computer using a mouse and a set of icons. This suddenly made computers much easier to use: there was no need to remember a long list of commands, one could just click on an icon.

6. The increase in the processing power of mini-computers and mainframes, together with the invention of PCs and GUIs, allowed the concept of client-server computing to be born. The theory behind this is that a central computer (the server) provides data to many networked PCs (the clients).

7. Finally, the first primitive video conferencing systems were implemented, although they were very slow to use because of the limited network bandwidth available at the time.
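To make the doubling behind Moore’s Law concrete, the short Python sketch below projects transistor counts forward from 1971. The base figure of 2,300 transistors (roughly the Intel 4004) is an illustrative assumption; the point is the exponential shape of the curve, not the precise numbers.

```python
# A minimal sketch of Moore's Law: transistor counts double every two years.
# The 1971 base of 2,300 transistors (roughly the Intel 4004) is illustrative.
def transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
# 1971: ~2,300   1981: ~73,600   1991: ~2,355,200   2001: ~75,366,400
```

Even over three decades, the same simple doubling rule takes the count from thousands to tens of millions, which is why each stage below could support qualitatively new capabilities.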

This explosion of technology improvements triggered several changes to how Financial Services firms operate:

1. There was a continued increase in the automation of manual processes. This covered many areas such as account openings and insurance liability oversight.

2. The increase in database technology and the associated data analysis software allowed firms to genuinely start to perform analysis on their clients, sales, profits, costs, and so on. (It could be argued that this was the first attempt at data mining within the financial services industry.) Firms could assess their client base to see which clients made them money and which cost them money, and investigate cross-selling opportunities. Firms could also investigate their operating costs to see where there were blockages so that process improvements could be implemented.

3. There was also an increase in harmonized standards across the industry. For example, the same message format would be used across the industry for sending money, instructing a foreign exchange trade, and so on. This allowed much easier communication between firms, which in turn helped with the growth of industry infrastructure.

This standardized messaging helped enormously with making payments:

1. In 1984, the world’s first online shopping transaction took place. Jane Snowball, from Gateshead, UK, purchased food from her local supermarket using videotex technology.

2. In the late 1980s and early 1990s, the Globex electronic trading platform went live, and its standard messaging allowed access to a range of financial assets such as treasuries, foreign exchange, and commodities.

3. In 1986, the UK ATM network LINK went live with 33 banks and building societies which allowed customers of each of the 33 members to withdraw money from any of the members’ ATMs. In 1989, LINK merged with the Matrix Network to provide more coverage for customers across the UK.

This standardized messaging also helped with trading and general market infrastructure, namely:

1. In 1971, the National Association of Securities Dealers Automated Quotations (or NASDAQ) was formed in New York, in the United States, to allow trading across parties.

2. Finally, the convergence of standards also helped with the formation of SWIFT (the Society for Worldwide Interbank Financial Telecommunication), which provides a network for financial institutions to send messages to each other in a secure, standardized, and reliable manner. SWIFT covers a wide range of messages, such as cash, treasury, and securities updates and movements. (A simplified sketch of a SWIFT-style payment message follows this list.)
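To illustrate what such a standardized message looks like, the Python sketch below assembles a simplified MT103 (single customer credit transfer) body. The field tags :20: (sender’s reference), :32A: (value date, currency, amount), :50K: (ordering customer), and :59: (beneficiary) are genuine MT103 fields, but every value, account, and name is invented for illustration, and a real message carries additional header blocks and fields.

```python
# A simplified, illustrative SWIFT MT103 message body. The field tags are
# real MT103 fields; every value below is invented for illustration only.
mt103_body = "\n".join([
    ":20:EXAMPLEREF001",        # sender's transaction reference
    ":32A:240115GBP1250,00",    # value date (YYMMDD), currency, amount
    ":50K:/12345678",           # ordering customer: account number...
    "JOHN SMITH",               # ...followed by the customer's name
    ":59:/87654321",            # beneficiary customer: account number...
    "JANE DOE",                 # ...followed by the beneficiary's name
])
print(mt103_body)
```

Because every member institution agrees on the same tags and formats, a receiving bank’s systems can parse and process such a message automatically, which is exactly what made this standardization so valuable.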

Another interesting development during this period was the growth in firms purchasing software packages. Previously, firms developed and supported their software in-house or outsourced their computing to service bureaux, which was costly and time-consuming. However, with the growing trend of standardization and requirements being similar across firms, it became easier to purchase a package “off the shelf” to meet the need. These were quicker and often cheaper to implement. For example, there were various packages to allow connectivity to SWIFT, make global payments, or maintain books and records.

However, having said this, there was also a trend to take standard off-the-shelf packages and bespoke them for each firm’s needs. This created further complexity and risk. With hindsight, it would have been better if firms had changed their ways of working to fit the way the purchased package worked, as opposed to enhancing the package to fit the way the firm worked.

While the above developments were impressive and provided a large number of benefits to the Financial Services industry, they caused a variety of problems.

1. Firstly (and as mentioned above), the new technologies only really allowed changes to the inner workings of firms or the general market infrastructure. There was little benefit to the customers.

2. Also, technology infrastructures started to become really “complex.” They often consisted of different technologies (e.g., mainframes, minis, and PCs), running various packages (covering in-house developed, standard off-the-shelf, and bespoked off-the-shelf packages), with a spaghetti of integrations between in-house systems and market infrastructures. This infrastructure needed to be supported and maintained by an expensive staff base covering many different skill sets.

3. Firms were completely reliant on technology to operate, which meant any outages or issues could be serious problems. Therefore, firms had to implement controls around their technology to ensure it operated as designed, with suitable measures in place in the event of problems. This helped with the rise of formal standards such as ITIL (the Information Technology Infrastructure Library), a set of detailed best practices covering Service Design, Service Transition, Service Operation, and Continual Service Improvement.

Also, these technology infrastructures were becoming more and more complex to change. Previously, upgrading a piece of software was a straightforward task, but now, because each piece was integrated with so many other components, it became a risky and challenging exercise that required a large amount of planning, development, testing, and post-implementation support. Again, these challenges helped with the rise of standards around Release Management.

4. Finally, there is often an issue with social acceptance. Despite the benefits new technology offers, it is not uncommon for certain people to be nervous about using this technology. Therefore, it could take a while for new technology to become socially acceptable. For example, it took several years for ATMs to become popular in the UK.

Stage 3—Customers Are Now Going Online—1995 to 2005

This age provided many exciting developments which (unlike Stages 1 and 2 above) allowed firms to offer functionality, services, and products externally to the end customer, genuinely improving what firms could offer.

Without a doubt, the most noticeable development during this age was the development of the Internet and the World Wide Web (or Web). While these two phrases are often used interchangeably, they are different. The Internet is a packet-switching network based on the TCP/IP network protocol. It grew out of ARPANET, created by the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA) in the 1960s. The World Wide Web is a series of documents, each identified by a URL (Uniform Resource Locator), which are linked together by hyperlinks and accessed by an application called a web browser. The Web runs on top of the Internet.
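To make the distinction concrete, the minimal Python sketch below fetches a document identified by a URL. The HTTP request is the Web-level conversation; the underlying connection travels over the Internet’s TCP/IP protocols. The example.com address is a domain reserved for illustration.

```python
# A minimal sketch: the Web (a URL fetched over HTTP) running on top of the
# Internet (a TCP/IP connection). example.com is reserved for illustration.
from urllib.request import urlopen

with urlopen("http://example.com/") as response:  # HTTP over a TCP/IP connection
    html = response.read().decode("utf-8")        # the document the URL identifies
print(html[:80])  # the page is HTML, which may contain hyperlinks to other URLs
```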

The emergence of the Internet and the Web was helped by the continuing increase in processing power (remember Moore’s Law from earlier), new web technologies (namely HTML and HTTP), web browsers becoming increasingly mainstream, increases in common standards (to help interconnectivity), the rise of Internet search sites, and the increase in network bandwidth from simple dial-up modems to reliable broadband connectivity.

This perfect storm allowed the Internet and the Web to become mainstream and offer many opportunities to firms, namely:

1. Firms could develop websites that allowed customers to self-serve on a 24-hours-a-day, 365-days-a-year basis. The initial websites were no more than “brochure-ware” sites allowing customers and prospects to download literature, find out information, or request details. However, over time, this functionality would expand to cover easily codified functions such as accessing balances, producing statements, opening accounts, closing accounts, trading, and money transfers.

This provided some benefits to firms as well. By allowing websites to provide self-servicing functionality, firms could reduce staff levels in call centers or ensure staff were focused on the functions that cannot be codified, for example, dealing with deaths or very complex client queries.

2. However, there was an interesting development that was not predicted at the time. The Web allowed new, more entrepreneurial and aggressive entrants to enter the financial services market by offering new and/or better functionality. This was good for the customer because it created more and better products and services. However, it was the first time in many years that there had been a serious challenge to the existing firms within the industry, and many of these existing firms were caught “flat-footed” and were slow to react. For example, in 1998, PayPal was launched, offering a new, convenient, and flexible method for making payments, and in the same year, the Egg Internet Bank was launched in the UK. It took some of the existing players several years to react to this.

This entrepreneurial spirit also allowed many other new products to be developed. In 2003, Chip and PIN was introduced in the UK for safer and easier payments, and in 2004, card spending in the UK exceeded cash for the first time.

3. The final benefit of this technology was easier communication. The Internet allowed standardized e-mail, video, instant messaging, secure messages, and file transfers between firms. This improved efficiency and reduced costs.

Despite the great benefits and opportunities of the Internet and World Wide Web, they did cause several key issues.

1. The already complex technology infrastructures (see Stage 2) were becoming more and more complex. On top of the existing stack of different technologies, various supplier/in-house packages, and a mesh of integrations, a mass of new components needed to be added. This covered the infrastructure to support Internet connectivity, the technology to support the various websites, and the software integration links from those websites to back-end systems for processing requests (such as account opening). This infrastructure needed to be supported and maintained by a costly set of staff covering many different skill sets.

However, as firms built websites, they opened up their technology infrastructures to the world. This created a new type of risk called cyber risk. This risk typically relates to any outage, disruption, financial loss, or reputational damage caused by a problem with a firm’s technology infrastructure. Specifically, this could cover (a) accidental and/or deliberate breaches of security, (b) system outages due to system problems or poor design, (c) somebody deliberately trying to access (or hack into) the system, and (d) somebody trying to steal data. Furthermore, because of the 24-hours-per-day nature of websites, they need full resilience and business continuity processing to cover outages. If a website is unavailable, customers cannot access their details. This type of outage is very visible and can damage a firm’s reputation. Therefore, firms need to implement controls and checks to manage these types of risks.

The result is that these technology infrastructures are now unbelievably complex, and firms must implement processes and governance to manage them and the risks associated with them. As mentioned earlier, firms are looking to use formal standards such as ITIL to help with this.

Also, as mentioned earlier, these technology infrastructures were becoming more and more complex to change. Any change is a complex process and needs to be managed carefully. These challenges helped with the rise of standards around Release Management.

2. Similar to the point noted under Stage 2 above, there is often an issue with social acceptance. Despite the benefits the Web offers, people are still nervous about using new technology. This was especially common with online servicing because people were nervous about their details being stolen, which in turn would make them susceptible to fraudulent transactions. Therefore, it does often take a while for new technology to become mainstream.

3. Finally, several clients do not like the loss of personal contact. While they find self-servicing over the Web useful, they still want to be able to speak to a human about their affairs. They feel nervous about providing confidential details to a “screen” without feeling comfortable with what the “screen” is doing with their details. Also, not everyone has access to the Internet or World Wide Web. Consequently, firms need to ensure they retain a personal element to their client servicing.

Stage 4—Customers Are Now Leading the Industry and Going Mobile—2005 Onwards

This stage effectively brings this short history of financial services and its technology up to date.

This stage can be seen as the logical progression from Stage 3, where client interaction was moved onto the web, to a point where clients are now using a variety of hand-held devices, tablets, phones, and smart devices (over the Internet) to access their details. In effect, clients are now carrying a powerful computer in their pocket, and they are demanding instant and full-time access to their details. For example, balance inquiries must be real-time, and payments must be able to be made at the touch of a button. (This growth is commonly known as the Internet-of-Things and is discussed in-depth in Chapter 9.)

This trajectory has again been supported by increased processing power (again, note Moore’s Law from earlier), more open standards, better and more efficient programming languages, as well as increased network bandwidth (with 4G and 5G providing customers with what seems like instant access to their details).

It has also been supported by new and emerging technologies such as Big Data and Natural Language Processing, plus many others. (However, the rest of this textbook discusses these, so I will not cover them in detail here.)

Finally, it is worth noting that this fourth stage is also being motivated by societal changes driven by other business areas (Amazon, Facebook, Netflix, LinkedIn, etc.) which are not financial services firms in their own right. These businesses have revolutionized the way society operates with people expecting everything online with instructions executed at the “touch of a button.” Therefore, financial services need to follow this trend.

This technology capability expansion has provided some changes to how Financial Services firms operate:

1. Firstly, and at the most obvious level, several new distribution methods have been created. The existing web channels have been improved significantly, but firms have also created sophisticated applications (or “apps”) for hand-held devices, smartphones, wearables, and tablets. These apps need to provide all the instant functionality demanded by the majority of customers, for example, real-time balance inquiries, real-time transactions, and so on.

2. Also, these improvements in technology have allowed new, innovative ways to be implemented to support existing needs. One area of particular note is payments, where several new entrants have implemented products to try to improve the entire payments process. These new entrants are either existing technology firms looking to move into financial services or start-ups driving to push innovation. For example, Apple Pay, Google Pay, or Smile-to-Pay.

3. Further to the above point, the developments in technology have allowed innovations to be implemented. In 2009, the first version of Bitcoin was implemented (and this is discussed more in Chapter 11 below). Also, several crowdfunding services have been launched which allow start-ups to obtain funding quickly and directly from investors without having a bank or some other financial firm in the middle.

4. Finally, there has been an increase in standards to the point that the financial services industry can generally be said to be open. The Open Banking standard is a technology standard that makes it easier for banks to interact regarding moving accounts, supporting other providers’ products, and so on, and it will make it easier for new entrants to enter the industry. Likewise, Banking as a Service platforms have made it easier for organizations to launch “neo-banks”, digital banks built around improving the customer experience. (A hypothetical sketch of an Open Banking-style API call follows this list.)
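As a hypothetical sketch of what an Open Banking-style interaction looks like, the Python snippet below reads account data from an invented bank’s REST endpoint. The host, token, and endpoint are assumptions for illustration; real Open Banking APIs require registration, customer consent flows, and certificate-based security.

```python
# A hypothetical sketch of an Open Banking-style API call. The host and token
# are invented; real APIs require registered apps, consent, and certificates.
import json
from urllib.request import Request, urlopen

TOKEN = "example-access-token"  # would come from an OAuth2 consent flow
request = Request(
    "https://api.examplebank.com/open-banking/v3.1/aisp/accounts",  # hypothetical host
    headers={"Authorization": f"Bearer {TOKEN}"},
)
with urlopen(request) as response:
    accounts = json.load(response)  # account data in a standardized structure
print(accounts)  # data a third-party provider could consume in a uniform way
```

The design point is that because every participating bank exposes the same standardized interface, a third-party provider can aggregate accounts across banks without writing bespoke integrations for each one.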

However, it is now fair to say that for the very first time in its history, the Financial Services industry is being led by the demands of its customers and clients as opposed to a set of large financial firms.

Despite the great benefits and opportunities of hand-held devices, smartphones, tablets, and the World Wide Web, there are many key issues.

1. For legacy or existing firms, the complex technology infrastructures (discussed in Stages 2 and 3) are becoming more complex to the point that it is hard to comprehend them. They consist of an interconnected set of different technologies, various supplier/in-house packages, a network of integrations, Internet connectivity, technology to support websites (including links to back end systems), and, now, further technology to support hand-held devices, smart phones, and tablets. Apart from the cost, staffing, and governance required to support this, the risk level (especially Cyber Risk) has increased dramatically.

2. While any new entrant will not have the issues that the legacy firms have, they will still need to build a complex technology infrastructure to support their products and clients. They will need to ensure that they have the relevant staffing, governance, and risk management controls in place.

3. Consequently, Financial Regulators are now starting to take a real interest in how firms manage, govern, and oversee their complex infrastructures, especially in the areas of operational resilience (i.e., how firms cope with outages), the management of suppliers (i.e., do firms have sufficient legal contract clauses over suppliers to ensure they perform what they have committed to do?), how data is protected (i.e., to stop data breaches, etc.), and ensuring firms have named individuals who are responsible for all aspects of their business.

While these regulations are still being finalized, it is clear that failure to comply will result in severe penalties such as fines, termination of licenses, and individual criminal prosecutions of named individuals who are at fault.

4. Again, similar to Stages 2 and 3 above, there is always an issue with social acceptance of new technology. Despite the benefits of mobile computing, people are still nervous about using new technology. As noted earlier, this is especially common with online servicing because people are nervous about their details being stolen, making them susceptible to fraudulent transactions. Therefore, it does often take a while for new technology to become mainstream.

5. Also, some individuals and demographic groups are nervous about using new entrants and would prefer to stick to the “tried and tested” firms. This is something new entrants need to manage and overcome.

6. Again, as noted in Stages 2 and 3 above, some clients do not like the loss of personal contact. While they may like self-servicing using their phone, they would still like to speak to a human about their affairs, or at least be comfortable that they can quickly speak to somebody if they want to.

7. Also, not everyone will have access to hand-held devices, smartphones, and tablets, and (per above) some prefer “face-to-face” interaction. Therefore, firms need to ensure they have a personal element to the client servicing. This is particularly true for firms’ older customers who are sometimes uncomfortable with technology.

Summary and Wrap Up

The development of financial services has gone hand in hand with the development of technology generally. As technology has grown, it has allowed financial services to expand. Equally, financial services has always been a keen user of technology, and this use has helped drive the development of technology as a whole.

This growth has been driven by several factors, namely: (a) improved processing power, (b) better and more appropriate programming languages, (c) better database technologies, (d) improved usability through factors such as GUIs, websites, and mobile devices, (e) improved and more widely used standards, (f) the use of software packages, and (g) improved network bandwidth.

This growth has allowed firms and the general financial services market to prosper in several ways, namely:

It has allowed firms to generate internal efficiencies, offer new functionality, reduce operating costs, and use wider distribution channels (such as the Internet, websites, and mobile).

It has allowed firms to become more client-centric by developing products, services, and capabilities that genuinely and continually meet client and marketplace needs.

It has also allowed new entrants to enter the market which are looking at new innovative methods for both existing functionality (such as payments) and new functionality (such as crowdfunding). While this is generally good for customers, it can be a threat to existing firms in the industry.

However, this progress comes at a cost in terms of the complexity of technology infrastructures.

The hardest hit are legacy firms that have been in the business for many years. Their infrastructure consists of overlapping layers of interconnected technologies, supplier/in-house packages, networks of integrations, Internet connectivity, technology to support websites (including links to back-end systems), and, now, further technology to support hand-held devices, smartphones, and tablets. Apart from the cost, staffing, and governance required to support this, the risk levels (particularly around cyber and operational risk) are high.

There is also an element of supplier risk because some of the older technology is becoming increasingly difficult to support: the software developers who built it are retiring, and younger developers do not want to train in old technologies.

New entrants to the market will not have this baggage, but they still need to have appropriate staffing and governance in place to support their infrastructures and associated risks. However, the challenge for new entrants is establishing credibility in the marketplace and with their target customer base.

Consequently, Financial Regulators are now starting to take a real interest in how firms manage, govern, and oversee their complex infrastructures, especially in the areas of operational resilience (i.e., how firms cope with outages), the management of suppliers (i.e., do firms have sufficient legal contract clauses over suppliers to ensure they perform what they have committed to do?), how data is protected (i.e., to stop data breaches, etc.), and ensuring firms have named individuals who are responsible for all aspects of their business. While these regulations are still being finalized, it is clear that failure to comply will result in severe penalties such as fines, termination of licenses, and individual criminal prosecutions of named individuals. It is also expected that this regulatory scrutiny will continue with the implementation of the emerging technologies discussed in the rest of this book.

Finally, there is an issue around social acceptance of technology changes. Some individuals and demographic groups are slow in taking up new technologies, which means firms will need to support this group of clients. Also, some groups are nervous about using new entrants and would prefer to stick to the “tried and tested” firms. This is something new entrants need to manage and overcome.
