Friday, August 24, 2007

"Applied" versus "fundamental" research

From time to time, here in Brazil, we hear some bureaucrat (sometimes even a governor or president) talking about "applied" (meaning technological) research as a kind of opposite to "fundamental" (meaning academic) research. Although this kind of nonsense is also heard in other countries (I particularly remember one of the last ministers of science and technology of the Kohl government in Germany saying something along those lines), it seems to be more widespread here below the equator.

In my opinion there are only two kinds of research: good and bad. Of course, some misguided minds in academia may be doing research purely for its own sake. That is not "academic"; it is simply bad! Just as it is bad to do technological research funded only by government money, without a (paying) link to the country's industrial sector. Should we do "free" research to foster our industry? That should be denounced to the WTO as a hidden economic subsidy!

I agree that a government should be allowed to promote research in certain critical areas, as was the case with the genome project, but the creativity of the scientist should not be hindered by funding only "meaningful" research.

Let us argue by reductio ad absurdum: suppose we decide to fund only socially relevant science. Surely a noble intent. But who will decide what is socially relevant? My guess is that this will be done by a committee of three or four people. Even supposing the wisest scientists are chosen to form this commission (not likely to happen), we would still be trading about 10,000 thinking heads for three or four. As popular wisdom tells us, 10,000 heads think better than three or four.

Another reason to let researchers do science as they see fit is the impact on the new generations. Once, while teaching materials science to freshmen at the Escola Politécnica, I assigned an exercise about carbon nanotubes. After the class, a student asked me who was working on that in my department. I answered that no one was, and that the technology behind the exercise was so exotic (it was an experiment by IBM, which built a CMOS transistor using a carbon nanotube as the gate, assembling it with an AFM) that it was unlikely to become commercial anytime soon. But who knows? Suppose a real breakthrough in microelectronic device technology were to come from some similar technique. Should our engineers have to learn what a carbon nanotube is only after leaving the university, just to survive in the job market? Research in such areas, even those not directly linked to the country's present reality, is necessary for the sake of our students. They must be well prepared to face the competition, and that preparation can come only from lecturers who are aware of the major developments in the world, and such awareness comes from doing research.
