Removing Barriers to Technology Innovation
Science, technology and healthcare have advanced dramatically over the past few decades, but there is still great scope for innovation as technologies continue to develop. True innovation requires stepping into the unknown, and this is often limited by perceived hurdles – including tangible barriers, such as lack of resources, and emotional barriers, such as fear. What can be done to help drive innovation forwards? Aside from the obvious factors, such as time, money and fresh ideas, I’d like to consider some of the ways in which societal and workplace cultures can promote or prevent progress. I’ve grouped them into three broad areas of relevance for the life sciences and pharmaceutical industries.
Collaboration vs Competition
Many industries are moving away from closed, secretive cultures towards more open approaches that allow collaboration and sharing of information between organisations, including private companies. The common aim is to accelerate progress, such as finding new therapies more quickly through sharing academic and industrial scientific research data (eg, Cancer Research Technology’s various programmes). In software, there have been attempts to pool technical expertise across groups of developers and across industries for rapid creation of new software tools and platforms, notably the well-established Linux community and, more recently, the Open Compute Project.
This movement towards greater collaboration could be seen as very risky. It is driven by urgent consumer or end-user needs – conflicting with the usual corporate drivers of increased profit and gain of market share. Furthermore, collaboration between academics and/or companies requires sharing of data that not only gives away perceived knowledge advantages to potential competitors, but ultimately risks losing ownership of intellectual property. Why, then, does it occur? Is it the result of a philanthropic urge, or could there be advantages for participating organisations in addition to producing end-user benefits?
It seems there are potential advantages, and these are emerging as a result of recent economic shifts. The life sciences industry, and particularly pharmaceuticals, has been permanently changed by recent recessions, which have resulted in significant layoffs across numerous R&D departments and a continuing wave of mergers and acquisitions. There is less funding available for fundamental academic research and more emphasis on grants with tangible outputs. The industry as a whole faces greater requirements for accountability, with budgets justified by demonstrating return on investment.
As a result, many organisations lack the internal resources and expertise they need for scientific discoveries or innovative product development, which are essential to remain successful in the life sciences. Some companies can outsource or insource certain R&D projects and niche expertise, but this still requires budget, project management and building trust with third parties. The alternative is to form true collaborations that rely on different capabilities from each party to achieve the desired goals. There is no client-supplier relationship in such arrangements, and the investment can often be jointly managed, typically requiring time and internal resources as opposed to significant cash budgets. Importantly, the risks can be shared by all contributing parties.
To be successful, this model requires truly equal commitment to the project from all parties and total agreement on the desired outcomes. The priority has to be the success of the project, and this necessitates a change in employee mentality and business cultures.
Whether or not this model can be sustained in the long term remains questionable. Firstly, products arising from inter-organisational collaborations may be innovative, but their profits would be diluted across the contributing parties or, in some cases, non-existent: collaborative efforts in the software industry usually aim to produce open-source software. Secondly, widespread collaboration would reduce competition, which would not only damage the economy and reduce consumer choice, but would ultimately remove the need for companies to innovate at all. Allowing more collaboration between organisations can be beneficial for innovation, but only when it enables true synergy.
Progression vs Privacy
The arrival of smart phones, along with improvements in wireless technologies and mobile data collection, has led to significant changes in the way we make purchases, consume entertainment, and read and engage with media. In turn this has led to large-scale developments in rapid data collection and analysis that have allowed major innovations to emerge, such as fitness bands and other wearable technologies.
These changes also offer great advantages for healthcare, opening new possibilities for automatic submission and monitoring of live outpatient data via smartphone apps. One example is monitoring blood glucose levels in people with diabetes, where digital collection and submission of patient data provides a more accurate, reliable and traceable approach than current self-monitoring methods. Similarly, these technologies hold the key to improved collection and submission of data for clinical trials, which could greatly enhance the quality of trial data as well as reducing the economic and labour burden of current data collection methods.
In countries such as Sweden, where healthcare records and drug dispensation are fully digitalised and linked to every citizen’s personal ID number, these emerging developments are becoming a real possibility. A compulsory ID card system has numerous advantages because the personal ID number can be used to store almost all personal data. This allows reliable maintenance of electronic medical records, as well as instant and hassle-free systems for numerous daily activities, from collecting loyalty points when shopping to receiving parcels, borrowing library books or hiring a car.
However, these ID numbers also hold the key to sensitive information such as the individual’s address, mobile phone number and even their income and tax returns. In some populations there remains a general aversion to sharing personal data, despite the widespread embrace of smartphone technologies and self-submission of data and content to all kinds of apps and platforms. Polling in the UK has established that the majority of Brits are strongly against compulsory ID cards, which are perceived as an invasion of privacy. The UK is also densely populated, and the vital changes – such as an electronic medical records system – needed to underpin revolutionary digital healthcare innovations remain exceptionally difficult to implement. Furthermore, the country’s over-burdened mobile phone network still cannot guarantee even 3G coverage nationwide, which undermines the practicality of many new data-collecting technologies. By contrast, less populated countries, such as Sweden and Finland, that are leading the digitalisation of healthcare are also implementing 5G.
Digitalisation of healthcare has great potential to change the lives of patients and healthcare providers, but in some countries ageing infrastructure and societal privacy concerns are impeding the implementation of such innovative, life-changing technologies.
Democracy vs Decisiveness
Successful innovation across the life science and pharmaceutical sectors also depends on agility. This is essential for allowing businesses or researchers to respond to new developments, to rethink their strategies and to reshape their ideas accordingly.
Although few business decisions are made by a single person, the way in which decisions are made and information is handled varies from one organisation to the next. This is strongly related to the organisation’s degree of democracy and culture of equality. In the corporate world, it has been traditional to empower small groups with appropriate decision-making responsibilities. These groups may report directly to senior management, and the outcomes of their decisions are fed downwards through the organisation in a single-minded and relatively autocratic manner. This approach is effective and decisive, setting clear boundaries within the work environment. However, it is not particularly open or flexible in accommodating differences of opinion and, in larger organisations with long chains of command and reporting, it can become a very slow-moving and cumbersome process. Furthermore, a rigid, procedure-driven mentality is not conducive to developing a creative and innovative working environment.
In some organisations, there is greater emphasis on involving wider groups in decisions. This ensures that many individual voices are heard across different areas of an organisation, and large teams can be used to discuss and finalise the outcomes. This creates a more open, democratic and transparent culture that is often assumed to be more conducive to creativity. In reality, too many decision makers can result in extremely prolonged decision-making that requires significant time and resources. In some cases, that time and resource may be better spent simply taking the action, rather than discussing what actions to take. An agile workplace culture is vitally important for innovation and creativity, regardless of how many decision makers are needed to purchase a new light bulb.
What other influences affect innovation and how can we remove these barriers? Contact me @kateatnoch