30-07-2020 | By Robin Mitchell
Continuing its trend in research funding, Samsung is providing $83 million for universities and students to research display and chip technology. What resources will be provided to researchers, and what role do universities play in technological development?
As commercial pressure on tech companies to produce the latest devices increases, the importance of research cannot be overstated. The COVID-19 pandemic, however, has caused a global economic downturn with mass unemployment and shrinking revenue for businesses across the board (unless you are a car insurance company, in which case you are laughing). Such a downturn not only hurts employment; it also stifles research, with funding either reduced or redirected to other efforts. Samsung, however, has announced that it will push back against the impacts of COVID-19: not only will it continue funding research, it has roughly doubled that funding from around $40 million to over $80 million. The South Korean company is looking to enlist university researchers in the development of display and chip technology by providing funding in both cash and resources. According to Samsung, the program is not aimed solely at engineering departments; most basic science departments can also take advantage of it.
The move to increase research into chips and display technology is not just a result of COVID-19; Japan’s tightening export controls are also pushing Samsung to act. Japan is currently one of the world’s biggest suppliers of essential semiconductor materials, including hydrogen fluoride (for etching), photoresist, and fluorinated polyimide. Its hold on the market, coupled with increasingly stringent checks on exports of these materials, has been causing production delays at Samsung facilities. As such, Samsung has been looking to local sources in South Korea to keep production running and to reduce its dependence on foreign suppliers.
It could be argued that, of all industries, semiconductors are the one that requires constant investment in new production and application technologies. This is no different for Samsung, especially considering that Samsung not only creates semiconductor devices but also manufactures the end products that use them. Last year, this drive for market share was demonstrated when Samsung announced a $100 billion investment in logic devices extending until 2030. This long-term plan aims to put Samsung in competition with tech giants such as Intel and Qualcomm, while also investing in its foundries to produce devices for other manufacturers.
While many universities rely on grants and tuition fees as their primary source of income, others can capitalise on their ability to research and patent their discoveries. For example, the University of Oxford made over £24 million from non-software IP, while the University of Southampton made over £895,000. While large tech companies such as IBM often perform their own research, it can sometimes be simpler to provide grants to universities, which then select students and researchers to tackle problems related to the grantor’s field. This has the added advantage of access to those who are still developing their skill sets, and who may therefore approach a problem in an unorthodox manner. The grant may give the company rights to the resulting research, or it may give the university the freedom to explore, with the right to retain the resulting IP. Either way, universities around the world have contributed massively to the development of technology, and continue to do so to this day.
One story that I see crop up often is scientists and researchers complaining about how small their reward is for their discoveries. A scientist can devote years to developing a new transistor, or a method for creating a flexible sensor, only for the discovery or invention to be owned by the university or business they work for. From there, the IP makes millions, leaving the researcher with a handshake and, if they are lucky, a small bonus that might help towards a new car.
However, what these stories never mention is the millions upon millions that companies and universities invest in researchers who, more often than not, discover nothing. The scientist who develops the next-generation transistor may have spent 20 years at the university on other research that resulted in nothing more than a three-page paper on crystal orientation. Those years of wages were paid with a significant risk that the individual would contribute nothing to the scientific community. Large corporations such as IBM and Intel give researchers the opportunity to work with equipment they could never afford themselves. Long gone are the days when research could be done in a garage with a few bits of chemistry equipment; next-gen research now happens in the depths of research labs unavailable to most, funded by those hoping that just 1% of it amounts to something.