The Management Information Systems field is constantly evolving. One way of assessing the nature of a field is to analyse the communication system available to its scholars (Holsapple et al., 1993). Research books, journal papers, proceedings, and dissertations serve as the main interaction channels.
Some academic departments have defined lists of journal rankings. Producing such a list can be an arduous and controversial process, particularly when it is based on internal and subjective methods. These problems are aggravated in Management Information Systems, because the field relies on paradigms and knowledge from other disciplines (Khazanchi and Munkvold, 2000). That is, IS faculties often have diverse backgrounds and diverse research interests.
Many of the theories that have been deployed in the Management Information Systems field are founded on different reference disciplines. This is also often reflected in the naming of departments: we have departments of "Computer Information Systems", "Management Information Systems", "Business Computing", "Business Information Systems", and more. The interdisciplinary character and relative youth of the Management Information Systems field may contribute to the variety of highly regarded journals in which Management Information Systems research is published. This range makes it hard to recognize and assess the journals in which IS faculty publish. In this sense, studies that focus on clarifying journal quality are a valuable effort.
Numerous studies published in the academic literature address the issue of journal quality in Management Information Systems as well as in other fields. However, there is no agreement on how best to conduct journal assessment. The methodologies proposed can be classified as either subjective or objective (Turban et al., 2000), depending on how the information is acquired and used. The subjective approach, or perception analysis approach, uses questionnaires to request subjective data from scholars and practitioners.
These data are compiled and the journals are ranked based on the respondents' perceptions; each study uses a different model for aggregating the subjective perceptions. The objective approach, or citation analysis approach, determines journal rankings based on some form of citation counts. Information on the citation impact factors of different journals is available in sources such as the Journal Citation Reports published by the Institute for Scientific Information.
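To make the two approaches concrete, the following is a minimal sketch of each: averaging survey ranks for the perception analysis approach, and the standard two-year impact factor formula for the citation analysis approach. The journal names and figures are purely illustrative assumptions, not data from any of the studies cited here.

```python
from statistics import mean

def rank_by_survey(survey_ranks):
    """Perception analysis sketch: each journal maps to the list of ranks
    respondents assigned it; journals are ordered by their mean rank."""
    return sorted(survey_ranks, key=lambda journal: mean(survey_ranks[journal]))

def impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """Citation analysis sketch: the two-year impact factor for year Y is the
    number of citations in Y to items published in Y-1 and Y-2, divided by
    the number of citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / items_prev_two_years

# Illustrative survey data: three respondents ranking three journals.
ranks = {"MISQ": [1, 2, 1], "ISR": [2, 1, 2], "JMIS": [3, 3, 3]}
print(rank_by_survey(ranks))      # ['MISQ', 'ISR', 'JMIS']
print(impact_factor(150, 100))    # 1.5
```

The sketch also illustrates why the two approaches can disagree: the first depends entirely on who is surveyed and how their ranks are aggregated, while the second depends on which citation database is used to count citations and citable items.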
Journal quality is mainly based on reputation, but reputations rise and fall according to several criteria. Robey (2002) suggests that journals earn their reputations based on, among other criteria, their review process. Surprisingly, little has been done to formally evaluate the review process. This is a critical issue, since the process is usually slow and a source of delay. In this sense, the author believes an objective methodology for peer-review assessment is necessary. The review process has always been a contentious issue, intensely debated among the various stakeholders.
Holsapple, C.W.; Johnson, L.E.; Manakyan, H. and Tanner, J. 1993. A Citation Analysis of Business Computing Research Journals. Information and Management 25(5). pp. 231-244.
Khazanchi, D. and Munkvold, B.E. 2000. Is Information Systems a Science? An Inquiry into the Nature of the Information Systems Discipline. The DATA BASE for Advances in Information Systems 31 (3). pp. 24-42.
Robey, D. 2002. Is Information and Organization an A journal? Information and Organization 12. pp. 213-218.
Turban, E.; Zhou, D. and Ma, J. 2000. A Methodology for Evaluating Grades of Journals: A Fuzzy Set-based Group Decision Support System. Proceedings of the 33rd Hawaii International Conference on System Sciences.