Given that there exist millions of publications in the world, why is it important to personally publish 10 publications versus 100 publications versus 1000 publications representing only a tiny fraction of the published knowledge?
Because the academic world takes the number of publications, or at least the impact factor, as a synonym for scientific capability and anticipated success in a scientific career.
The motives for researchers to maximize the number of their own publications are many and varied.
They can include promotion, support for research, scientific development, problem solving, or the pursuit of excellence.
Publishing 10 publications versus 100 publications versus 1000 publications, each representing only a tiny fraction of the published knowledge, is a psychological issue. Every researcher tries to publish as much as he or she can while, of course, maintaining the quality of the research.
They are seeking distinction in their contributions and in serving their communities.
What about symbolic badges of status aimed at non-specialists, and therefore used as information in social networks, independent of the content of the publications?
To publish one's ideas is an act of communication, not merely the satisfaction of increasing the number of works published. Someone who decides to do science may worry little about himself; he wants to contribute to the world, not to himself. By the time one starts writing, one may have read many other papers and books, having set one's life aside to understand a problem that had not yet been addressed at all, or not in the manner one has chosen to explore it. People who write just anything are seldom practicing science. If pride, if self-love, were his main motivation, he would not stop taking time for himself. A person moved by pride may use the ideas of other writers as if they were his own, but that is not science.
There are good and bad reasons behind it!
Besides the good points mentioned already, I can think of a few more! There may be many reasons: spreading good documents, duty, reputation, ... or perhaps academic fulfilment (promotion, getting grants, support for research, ...).
Here, research funds are distributed based on faculty performance (book publishing, thesis supervision, paper publication, conference attendance, ...). So, in order to get more money to our credit, and hence be able to supervise new students, to do research, and to give financial support to our graduate students, we need to win grants! So we accumulate as much as we can for such expenses. Very simple: we publish more. And to get more grants we should be able to publish in high-IF journals.
I do not think that scientists wish to maximize the number of their own publications; only mini-scientists do.
In addition to the good reasons expressed already, I need to communicate my creative thinking, which may be useful for progress. Publishing papers is the platform I have for validating my mental work and getting the recognition that I think human beings need.
Dear Alexis,
that's an important aspect. But this doesn't mean that you have to increase the number of your papers; you must improve the quality and look for scientific feedback.
The motivation of the academic world is very individualized, but for those dedicated to integrity and truth in science, peer-reviewed publications and their numbers in higher impact factor journals are not only a way to demonstrate scientific capabilities but can change dogma and establish new truths, where possible. The scientific professional career is tenuous at best in today's society, but we now have ResearchGate too as a potential bright spot for young investigators and a place to do things that were once only the bastion of the publication media.
There are varied reasons behind the tendency of scientists to maximize their number of publications. A common reason of particular concern is the growing competition for research funding and academic positions, usually combined with an increasing use of bibliometric parameters to evaluate careers (e.g. number of publications and the impact factor of the journals). Competition is encouraged in scientifically advanced countries, which pressures scientists into continuously producing publishable results. Also, the "publish or perish" culture in academia conflicts with the objectivity and integrity of research, because it compels scientists to produce publishable results at all costs. In many areas of research, papers are more likely to be published, cited and accepted by high-profile journals if they report positive results, and less likely to be published and cited if they report negative results. Such compulsions conflict with objectivity and integrity and lead to scientific bias.
Dear @Marcel, I do not wish to maximize the number of my publications, but to increase the practical applications of my research! It counts, right!
Dear @Abedallah, I do very much agree with your comment!
I agree with Hanno. It is not the number of publications, it is the quality. Some authors publish quality papers (well written, with important information), but within a series of publications there is little additional information in each succeeding one. Each succeeding publication cheapens the total because each contains insufficient information. One excellent publication is better.
The current publishing system invites multiple publications, too many publications, too many pages to fill. Often journals will require that a long article be broken into two or more articles. This journal system is fed by the academic illness of publish or perish.
Bound journals limit improving a given publication and incorporating feedback. Online publishing could allow 'living' publications where the author could alter and add to a publication. Data sets and experimental descriptions could be added, greatly improving the information in a manner that is not possible with journal publishing.
It is important to publish. It is how we are judged and become known. It is better to be known as a quality scientist, than a scientist with a lot of publications.
Each publication may be a tiny little step towards disclosing secrets, solving problems, improving knowledge and acceptance and meeting with universal approval. Operating experience shows: the more publications the larger responsive audience.
I agree with Hanno too: quantity and impact factor are able to "open doors". Recognition is a key element for positions, opportunities, and funds.
The purpose of publishing a manuscript is to communicate your message to the research community and to take part in the research map of society, which will also reflect on one's personal education.
Pressure from outside should not be mistaken for scientists' desire to increase their writing output. I agree with Abedallah: only the little ones take that road.
When an authority in your university, for example the department of research, has a limited fund and wants to know about your scientific capabilities so as to distribute the fund accordingly, how should they decide? They look at your past performance: H-index, patents, ... Should the fund be distributed equally to all for the sake of justice? Should the fund be wasted on someone without the ability to write and publish his/her findings or to conduct a proper research activity? The answer is probably a big NO. So, there should be proper metrics to measure scientific capabilities. I don't think calling someone "little-ones", "long-ones", "mini-scientists", "maxi-scientists", etc. makes for good metrics or makes us good or bad scientists! Diligence, hard work, painstaking effort, persistence, a beautiful mind, creativity, an active role in the community, ... are better ways to compare different people. The number of publications doesn't matter; what matters is the ability to publish. You've probably heard the term "put your money where your mouth is", meaning back up what you say and don't just give out advice that you don't follow yourself, or spew an opinion that you can't or won't stand behind. However, I think there is a key word that needs to be replaced in this phrase: money doesn't back up what you say; action does.
Dear Mahmoud,
I understand the problems with fund distribution and agree with your argument not to spend the fund in equal parts for all in order to be fair. The question here is: is the number of publications a better measure, or should one find a more appropriate "algorithm" to evaluate scientific quality?
Dear Hanno:
I answered your question in my first post, where I said that research funds here are distributed based on faculty performance (book publishing, thesis supervision, paper publication, conference attendance, ...). So, as you can see, here (in Iran) it is not just publishing more papers; rather, it is a package. It would be more just to evaluate the overall performance of academics by including even more items than those inside the parentheses above.
To get promotion in one's professional career in time and to establish one's credentials in the scientific world. Thanks.
Dear Mahmoud,
you are right to look at all the other performance measures you cited. What I'm still missing is an additional and reliable "quality factor" independent from some countings.
Dear Hanno, you said: "What I'm still missing is an additional and reliable 'quality factor' independent from some countings." Can you give a hint about this Q-factor?
Dear Hanno,
I appreciate your comment, with which I fully agree. Personally, it is not the number of papers that drives my scientific activity but the pleasure I take in trying to find novel solutions to key problems in my research field. Publishing papers follows as a natural output, which has a positive impact on my activity, for which I need recognition to be happy.
Q-factors: publications that become textbook examples? Publications that are frequently discussed during university education courses? Is all this ignored in impact factors?
Dear Marcel: No, they are not ignored! These high-quality works are reflected in another metric called the h-index. A higher h-index means the work is more useful to others, or more frequently discussed everywhere (including in university education courses).
@Hanno & Mahmoud - Mostly, three bibliometric indicators, based on journal impact factors, citations and journal ranks, are used to assess the quality of research outputs worldwide. However, strong arguments have been made both for and against the use of the impact factor as a measure of research quality. At the level of individual papers, the three bibliometric measures are fairly blunt instruments for assessing research quality.
All researchers try to publish as much research as they can. But the quantity of research should not come at the expense of its quality.
Most universities, I believe, encourage joint research and assign the same weight in promotion systems to the first and all other authors, regardless of who is the first, second, or third author.
Dear Marcel,
The answer of Hanno cannot be surpassed regarding mainstream conditions. Theoretically, it is possible that some of us have excellent and very new ideas and results which should be published immediately. (:-)
Dear Marcel,
Dear All,
Vanity and money make the world go round. A long list of publications (some or many of which may be the result of the work and efforts of others) can contribute to the illusion of being a "big potato" in science, as well as to becoming a boss with good remuneration.
The causes of maximizing the number of publications differ for each researcher.
Dear Andras, you say that "a long list of publications can contribute to the illusion of being a 'big potato' in science". What a nice analogy by an intelligent scientist! How about this one: "an empty vessel makes the most noise"?
Even though recognition is one of the reasons, professors and scientists may have different reasons for doing so. For instance, professors (depending on their institution's policy) might be relieved of some class hours in exchange for a certain number of "good" publications: it's a good deal...
I believe publishing papers in good journals and conferences is an art, and as a researcher/academician your priority should be to get some good results and take the credit by publishing them. In addition, well-cited publications will definitely fetch you distinct recognition in your domain, which also acts as a booster for your future endeavours.
I like very much Marcel's question. And yet, I like even better its justification.
In this sense, I firmly believe that a sound strategy consists in writing and publishing various articles and papers aimed, for instance, at different journals and magazines. Some of them, of course, highly academic and scientific, but some others for a more general audience, always taking care that they should be prestigious journals, media, etc.
As scientists we should be able to communicate with our peers, but also with society at large. Thus, a parallel and distributed strategy (as much as possible) helps, I believe, knowledge, innovation, society, and even ourselves.
I liked what Carlos said about publication strategy, but it does not make sense to me to discuss publications in terms of numbers. The two directions you mentioned (scientific and popular) are reasonable and fit well with what we researchers should do. Knowledge should be shared between us researchers and the general audience. Why care about the number of publications? In the end, the market will weed out those who do not have anything interesting to say.
Dear Mariusz, thank you for your comment. I have never talked about numbers, though.
In fact, I do practice the strategy just suggested. Sorry for this; it pays back...
It's simple misunderstanding :-) Maybe I should have been more clear. Talking about numbers I did not refer to your answer but to the problem posed in the question that initiated our discussion.
Dear Mariusz, never mind. Moreover, I do agree with you! Reducing the point to quantities impoverishes the issue. However, I guess it was meant as a general expression...
Dear Roland, you think "most humans do have EGO". To be able to publish something worth reading in a reputed journal you must pass many stages: 1) do decent research, 2) analyze the results, 3) add discussion, 4) prepare and write the manuscript, 5) submit it to a good journal, 6) get feedback from referees, 7) make lots of corrections, changes and rebuttals to satisfy all the referees. Provided you pass all these steps, you must wait another 6 months to 1 year for publication.
I think passing those steps requires something more than EGO. In my experience, it probably requires knowledge, intelligence, capability, diligence, the ability to do it yourself, humility, honesty, a wish for some luck, .... If you call all these EGO, then add me to the majority of humans. Otherwise, what should we do for those who are starving and cannot be fed?
Should the number of own publications be maximized or optimized?
Dear Marcel,
could you help me: what would an optimized number of papers be?
Hi,
each day something new is discovered in the scientific world. As a researcher, I want to know how methods and results differ between works. When researchers finish a thesis on their favorite topic, they want to demonstrate it to other scientists.
When researchers are young, they want to maximize their publications to show themselves as talented; but with time, they want to contribute something good to a better scientific society and to optimize their publications with more patience.
No, @Marcel M Lumbrecht, it cannot be. And it should not be, because good research should always be continued, and its publication numbers cannot be fixed within limits of maximization/optimization. Thanks.
Optimizing is maximizing the difference between benefits and costs of publication. Possible benefits and costs of publications have been mentioned already.
Some benefits: Past work that is made visible and verifiable by others, Progress of knowledge, intellectually occupying other people for years, etc...
Some costs: Publishing results of hasty science practices, like (unverifiable) errors; complication of the identification of reliable versus less reliable published results, intellectually occupying other people for years, etc...
Dear Marcel,
thanks, but I'm not sure that young scientists who want to become known will ask for your type of optimizing; they want to be present.
Dear Hanno,
of course. Scientists/commissions/referees/editors get wiser when they get older, or not? Perhaps there is more pressure on students to perform, or not? Perhaps there is more pressure on scientists that have to handle huge budgets to justify the budgets, or not? All human factors.....
As it is clear by now that the impact factor isn't a valid parameter for individuals, many organisations use the h-index.
And this index will only grow if you publish a lot. If you have only one, but highly cited, publication, your h-index will stick at 1.
Dear Birgid,
Given the millions of ideas circulating, it is already very nice if a scientist can make one significant contribution to science/education. I therefore think there is also a problem with the h-index.
Dear Marcel,
sure!! But as long as decision makers like it easy ....
They just look at amounts without even truly knowing the contents of what they have to judge... All human factors again....
Optimizing is maximizing the difference between benefits and costs of a commercialized product. Possible benefits and costs of commercializing the product are:
Some benefits: progress in the quality of life; intellectually occupying other people for years (culture addressed to a wide public, expressed in different forms, including books with accessible contents), etc...
Some costs: commercializing results of hasty science practices, like (unverifiable) long-term environmental consequences; complication of the identification of reliable versus less reliable long-term effects; intellectually occupying other people for years, etc...
I am confused by now! Until now I thought that when we find something new during a research activity, it is beneficial to publish it for other scholars (the scientific community) to read and use in their own research. Some people here argue that we should not publish too much, otherwise we will be accused of many things. Our job is like what doctors do! How many patients should your doctor see each day? Some see hardly any, some see a lot. It depends on many things, including skill, capability, ... How many papers, patents, ... should a faculty member publish during his/her 30-year life in the university? It depends on many things, including skill, capability, ...
Dear Mahmoud,
Optimization can be defined at the within-individual level. Individual optimization assumes that the optimal number of publications will be higher for some scientists than for others, e.g. for biology-based reasons (e.g. the number of hours of sleep required). For a given individual, the optimal level may also change across research environments. For instance, in a richer research environment A, the optimal number of publications of an individual will be higher than in a poorer research environment B.
The problem is that judges/commissions do not have access to the potential costs and benefits of publication efforts of individual scientists in relation to her/his local environment. Perhaps some scientists are forced to publish more than they can, i.e. to publish more than the individual-specific optimum in a given environment A.
OK?
Dear Marcel. The answer can be OKAY (yes) or not OKAY (no) depending on how we define "individual". The past is gone, so we must re-start afresh! In fact, due to the greater chances of collaborative research these days, as well as the multi-disciplinary nature of many research topics, we can do better than before and can even publish more than before (not as the aim, but if that is the case). So what matters most is no longer the individual but rather the ability to work as a team. However, individuals are different (as you related to some biology-based reasons). Besides your reason, there are also many other good ones (some genetic or inherited, some acquired) that I explained in my previous posts on this thread.
The main reason is that the evaluation system, even in this advanced world, still puts stress on the number of publications one has instead of their quality. Even a scientist with 100 papers gets more respect than one with only 5 high-quality research articles.
Excellent remarks concerning individual-based publications versus group-based publications. Do you think that the benefits and costs differ between these two publication strategies?
Potential benefits: Collaboration, sharing,..
Potential costs: Competition, arms races...
I think that scientists publish a lot in order to best tell the whole world about the results of their research.
Dear @Marcel, @Stefan has edited his answer!
Dear @Eraldo, you are right. Competition between researchers is a very good issue! It brings quality to science!
Maybe it should be pointed out again that number of publications is not of the primary importance, as the quality of papers is.
@Stefan,
I'm not sure that your optimistic view is right. A lot of "scientists" publish to promote their careers. But I would prefer your view to be the true one.
Total number of papers does not account for the quality of scientific publications. I think that more important than number of publications is the h-index.
Dear Stefan,
I repeat Marcel's question about the h-index. Do you have detailed information?
Here is how the h-index is defined by Wikipedia: "The h-index is an index that attempts to measure both the productivity and citation impact of the published body of work of a scientist or scholar. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications."
What are the alternatives? G-factor, RG Score, ... Even the h-index has some problems and limitations of its own; still, it is much more reliable than the RG Score and other such metrics! For example, to increase your RG Score you don't need any publications, productivity or citation impact; all that is needed is to get more up-votes when answering questions, even when the answers are largely second-hand, i.e. "copy & paste" through simple Googling. So, the quicker you search the net, the more up-votes you get and the faster your RG Score rises. That is some kind of productivity! That is why even Nobel Prize winners on RG have RG Scores of less than 30 or 40!
http://en.wikipedia.org/wiki/H-index
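As an aside, the Wikipedia definition quoted above translates into a very short calculation. The sketch below is illustrative only, not any official bibliometric tool; the function name and the sample citation counts are invented for the example:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the rank-th paper still has >= rank citations
        else:
            break
    return h

# Hypothetical citation counts for two researchers:
print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers each cited at least 4 times)
print(h_index([100]))             # -> 1 (one highly cited paper still gives h = 1)
```

Note how the second example echoes the point made earlier in the thread: a single publication, however highly cited, caps the h-index at 1.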
Do high citation scores used to calculate the h-index reflect quality of research? Some elements for more discussion:
The major problem with science citation indices (e.g. JIF, H) as tools for judging science quality is that they ignore the underlying (biology-based) causes of citations. The reasons for citations are often unidentified. Well-written papers that contain scientific errors might often be cited, which indicates that citation rates do not necessarily reflect science quality. It would also indicate that people often do not carefully read the articles they cite.

People might cite others to improve social or scientific integration, for instance to increase the probability of getting jobs in a given research domain, or simply to support colleagues or friends without taking scientific activity into account. Publicity or other marketing strategies, independent of science quality, might also influence citation rates. Perhaps there is a positive relationship between the number of times a paper is cited and the number of times its content has been presented at scientific meetings as oral presentations or posters. Perhaps a politically driven internet exposes some scientific publications more than others. People might also cite a paper because its scientific importance is estimated via the number of citations from others (cultural copying). Perhaps there are collaboration networks that cite publications from friends or close collaborators, with some people supported by larger collaboration networks than others. Thus the social integration of authors or teams might influence citation rates.

Should reasons for citation be more clearly defined? This is easier when a paper is cited, but not easy when a paper that has been read is ignored. For instance, convincing critical papers might not be cited because they embarrass readers or colleagues. Papers on complex, scientifically correct topics might be rarely cited simply because few readers truly understand the content. Do colleagues, referees or editors have the social power to promote citation of their own work?
Perhaps referees occasionally impose on authors the citation of the referees' own work in manuscripts submitted for publication. Thus publications might be cited or ignored for historical, social, competitive or purely scientific reasons.
Dear Marcel. Did you know that the Thomson Reuters Master Journal List currently contains a total of 17,011 journals (WOS, JCR)? How many of those fall within your speculation (I guess this is your concern, given the many "perhaps" in your remark)? With a little inquiry one can find one's way. Even if there are a few fake journals, there are still a lot left for consideration. Another point is that the list is not static: they revise their citations and records, removing the fake journals while promoting the good ones (in ranking). So actually we should not worry much.
Dear friends, if you allow me I'd like to suggest a different take (not that I believe it, myself).
Scientists (poor them!) are in a sense obliged to maximize their publications, indeed, as much as possible.
Publications, impact, and success do have, to some extent, a certain element of randomness. No one is certain, and certainly not ex ante, about the deep success of a paper or a book. Contingency pervades the life of the mind, too. Hence, generally speaking, a scientist publishes one text, and another, and one more, hoping that any one of them will say "bingo!". And, so to speak, he or she will touch heaven with his/her hands.
Maximizing (= the number of one's own publications) is a way of betting in a sort of lottery. If this makes sense, then the effort of so many scientists is well justified.
(Of course, I do not mean that the life and research of a scientist can be reduced to just this.)
Dear @Hanno,
I think that the @Mahmoud answer explains the h-index. More information can be found at http://en.wikipedia.org/wiki/H-index
Some more detailed information on the h-index is attached! "Scopus and Web of Science collect and organize citation counts and can calculate an individual's h-index. Likewise, Google Scholar collects citations and calculates an author's h-index via Google Scholar Citations. However, each source may determine a different value of the h-index for each individual. Sometimes the variation in the h-index between sources can be large."
Dear @Hanno, many resources are available now, but many questions do still exist!
http://images.webofknowledge.com/WOK46/help/WOS/h_citationrpt.html
http://subjectguides.uwaterloo.ca/content.php?pid=84805&sid=1885850
https://www.hsl.virginia.edu/services/howdoi/hdi-wosh-index.cfm
http://wokinfo.com/citationconnection/
Dear Ljubomir,
thanks for all the resources; I'll try to understand the measure and its relation to the scientific quality of authors.
Just another dimension to add. Outstanding scientists might work for at least one part of their career in private companies protecting information not accessible in publications. Will the perceived qualities of these scientists be underestimated with accessible h-indexes? Perhaps these scientists maximize/optimize inaccessible activity reports with unique findings/science stories? Should private (activity) reports be taken into account, but how?
Dear Marcel,
I have no clear answer to your question, only some historical reminiscences: just remember Los Alamos and nuclear physics, without any publications for years.
Good point, dear Hanno. In fact those scientists who are hired in such institutions know in advance that such are the rules. The same example could easily be mentioned also regarding other governmental and corporate institutions.
The very game is quite different there. And in there, everybody knows that.
Dear Carlos,
I think that´s the price for working under financially optimized conditions.
Yes indeed. Moreover, dear Hanno. I love thinking about such circumstances in Goethe or Thomas Mann (= Faustus). Because it is literally a way of "selling your soul". Seriously. No metaphors… Frightening, isn't it?
Dear Carlos,
Different topics may have the same educational value but may differ substantially from a financial/logistic/... point of view. Some projects may require only a couple of Euros/Dollars/... (e.g. a bird study) whereas other projects may require millions of Euros/Dollars/... (e.g. placing a robot on a space object). The pressure to produce results, as reflected in the number of publications/patents/..., should therefore vary according to the financial support, or not?
I do understand your point, dear Marcel. Moreover, I agree with your concern absolutely. The point rather is, following Hanno's remark, that there are some circumstances where you are forced to play the game, regardless of the number of publications, patents, etc.
Besides, there is a fine point about your post: the importance, or need, or urge to highlight one's own name. (This certainly has much to do with the h-index issue.) In contrast, those scientists who work for corporations or special state agencies are obliged to play a game that entails a high degree of anonymity. Vis-à-vis our western tradition, that's a high price to pay, isn't it? Of course, the reward is that you are very highly paid...
(Plus: remember Dr. Faustus…!)
Dear Ljubomir, the different values of the h-index in WOS, Scopus and Google Scholar are real. However, WOS gives the lowest (most correct and realistic) value. Scopus is very close to WOS (as it also records all the ISI databases). The number calculated by Google Scholar is the highest: Google Scholar uses citations of all of a scholar's publications to arrive at the h-index, whereas WOS only considers the top journals (ISI, with IF).
A very accurate distinction, dear Mahmoud. It is so, indeed.
Should citation numbers be maximized or optimized? The h-index seems to go for maximization of citations, focusing on a couple of publications. However, I presume that in a research field like specialized mathematics there are few researchers who can read and understand the top articles. Research fields with smaller specialized communities will therefore probably be cited less often. Citation rates thus become disconnected from research quality, or not? Do the existing citation/h-indices ignore this?
Dear Marcel,
Currently, the h-index is the most objective value out there, as it reflects both a researcher's number of publications and the number of citations per publication (a measure of a publication's quality). It provides a unique and clear view of how a researcher is performing over time. If you are looking at tenure, university faculty recruitment and advancement, the award of grants, etc., you'd want to see whether the applicant is consistent, and that is what the h-index shows. Although it is not the ideal metric and falls short in many situations (as you mentioned), and does not account for the possibility that some collaborators may have contributed more than others on a paper, it is currently one of the most common measures to look at.
Dear Mahmoud,
Accepting that all research fields have the same scientific value, I would add a correction factor that takes the research field into account, which could be expressed as a deviation from an average value per research field, or not? These research fields are clearly defined in ISI Web of Knowledge/WOS.
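One minimal way to sketch such a field correction, assuming a per-field average h-index were available from a database like WOS (the field names and averages below are hypothetical, purely for illustration):

```python
def field_normalized_h(h, field, field_avg_h):
    """Express an h-index relative to the field's average h-index,
    as a simple ratio: 1.0 means exactly average for that field."""
    return h / field_avg_h[field]

# Hypothetical field averages (illustrative numbers only).
field_avg_h = {
    "specialized mathematics": 8,
    "molecular biology": 25,
}

# An h of 16 in a small field scores higher than an h of 25
# in a large, citation-heavy field.
print(field_normalized_h(16, "specialized mathematics", field_avg_h))  # 2.0
print(field_normalized_h(25, "molecular biology", field_avg_h))        # 1.0
```

A ratio is only one possible choice; a deviation such as `h - field_avg` (or a z-score, if the spread per field were known) would express the same idea.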
Unfortunately, it is easier to measure quantity than quality. Some faculty also become experts at publishing the same information in different journals. The academic world should look at the reward structure when it comes to publishing. Publications and presentations should show a body of work developing a depth of understanding of a field of research. There should also be a limit on how many authors get to put their names on an article. If someone is #10 on the list of authors, how much, if anything, have they contributed? Yet it is counted as a publication.
Dear Madeleine,
If you want to contribute to a large-scale longitudinal study, you may require many collaborators who put a lot of effort into data sampling, literature research, etc. Their contributions are also a sign of consensus based on all these experiences. Why is it wrong to express consensus concerning a specialized topic?
On what criteria would you include versus exclude people for authorship?
When an approach is multi-disciplinary, should each contributor to a publication master all the techniques/literature of every discipline? Do you think this is the case for research directors/professors who are overloaded with administration/lobbying tasks?
Given the remarks mentioned above, I would propose that each detail (including the data sample procedures and data) of the content of a publication should be mastered by at least one of the contributors.
Your question about what criteria should be used for adding an author is an excellent one. In most cases, there are no criteria. Some journals do limit the number of authors. There should be criteria that involve active participation in the research and/or the writing of the study and article. I've seen department heads add their names to articles without ever reading them.
I've also never been clear on faculty adding their names to publications of their students. Those of us who have taught or continue to teach research have many students whose projects get published. Should faculty names be on those publications or not? What do you think?
Dear Marcel, you propose that "the content of a publication should be mastered by at least one of the contributors." You are right, and I believe that is actually the case; otherwise, how could one respond to tough referees' rebuttals? Anyway, if we have co-authors, it means we needed them for something beyond our own knowledge. For instance, if you are an electronics engineer doing a biomedical project, you need advice from physicians; in such cases your proposal won't work. Finally, having many co-authors reflects real collaboration and is not a bad thing. The paper "Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC" has 3,171 authors (IF = 2.679), and the paper "Initial sequencing and analysis of the human genome" (Nature V412, 565) has about 2,900 authors (IF = 42.351).
Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC, Physics Letters B, Volume 716, Issue 1, 17 September 2012, Pages 1–29
Dear Mahmoud, I might be wrong, but I don't think those 3,171 (or 2,900) individuals can be considered "authors". Has each one of them written anything? I'd say a list of collaborators is something totally different. Going further, I'd say that every author is a "collaborator", but not every collaborator is an author.
Dear Henrique, Good points, but why not? Is it a criminal act? They found (discovered) something in collaboration and published it together. So "writing something" is not all that defines an author; even ideas can be patented nowadays. I wanted to demonstrate the possibility. Both papers I mentioned ARE papers, not ordinary documents or research reports; you can check them in the journals I cited. These are rare cases, but nowadays, due to international collaboration and the multidisciplinary nature of many projects, the number of co-authors has risen too.
Hello Marcel, personally, publishing means maintaining an acceptable scientific status. On the other hand, when it comes to jobs and contracts, you may be an exceptionally capable and very creative scientist, but if nobody in the scientific community has read your writings or thoughts, you may be wonderful, yet you remain hidden.
Personally, I find it better to publish 10 well-crafted articles, understood by the reviewers and the scientific community in general, than to publish many articles just for the sake of publishing.
Regards.
Dear Francisco,
I do not understand written Spanish, but I think I understand some words. I guess other participants know in detail what you wish to say.
Students who are learning English and have to read an English publication will probably start with a dictionary, translating individual words, and then come up with an interpretation of the sentence's meaning. If a translator does the job, how do these students know the translation is the right one? Under these conditions, how can they produce publications that reach a high h-index? Is language competence a key to scientific success reflected in a high h-index, as discussed elsewhere?
Should the h-index therefore also correct for the country of publication (native English speaking versus non-native English speaking)?
Best regards,
Marcel
Dear Dr.Marcel,
Without technical analysis, just my opinion: one's career often requires production and dissemination, so the obvious way to advance is to publish more.
Or simply (only in some cases!) because it is common to think that quantity is better than quality!
Best Regards,
Vanessa
Dear Marcel, I like that, across many posts here on RG, you stand out by asking (in my opinion) sharp and suggestive questions. As you know, there are three general ways to track and increase your h-index: via WOS, via Scopus, and via Google Scholar. Although they differ, together they imply a diversified strategy.