


On the Relationship Between Citations and “Top 25” Downloads in International Journal of Accounting Information Systems


Abstract

Citations from existing sources like the Social Science Citation Index (SSCI) have long been used to evaluate impact and importance of research. However, in addition to SSCI becoming digital, additional citation sources have been developed, such as SCOPUS and Google Scholar. Further, information now is available regarding papers that are among the most frequently downloaded, providing a new potential measure of impact and importance. This paper analyzes the use of digital and web citations, and newly available digital download information in the form of “top 25” most downloaded paper lists for the International Journal of Accounting Information Systems.

This paper finds that the numbers of three types of “downloads” (abstract, full paper and denied accesses) are highly correlated with each other. Further, in an analysis of the International Journal of Accounting Information Systems papers appearing in “top 25” download lists, this paper finds that the number of citations and the number of times that a paper appears on a “top 25” list of downloaded papers are statistically significantly correlated. Finally, this paper finds that the set of “top 25” downloaded papers has a disproportionate number of citations.

Acknowledgements

An earlier version of this paper was presented as part of a panel discussion at the American Accounting Association meeting in Hawaii in August 2003. A revised version was then presented as a keynote speech at the NBES (National Business and Economics Society), Honolulu, Hawaii, March 2004. Most recently, a further revised version was presented at the American Accounting Association meeting in Chicago in August 2007. The author would like to thank those attending those presentations for their comments. For this most recent version, the author also would like to thank Steve Sutton and the referees for their comments on earlier versions of this paper.

Keywords

Downloads, Citations, SSCI, SCOPUS, Google Scholar

1. Introduction

Research paper importance, usage or interest is often equated with citations (e.g., Garfield and Welljams-Dorof 1992). Historically, citation analysis required consulting the book versions of the Social Science Citation Index (SSCI), which meant trudging across campus to the library to try to establish citations.

However, the world of publishing and citations has become digital. Recently, SSCI information became available in digital format on the Internet as the “ISI Web of Knowledge.” As a result, information about citations is easily and relatively universally available. In addition, many journals are now available in digital format on the Internet, which has made it possible to gather digital citations from other citation sources, e.g., SCOPUS. Further, research papers are increasingly available on the Internet: individuals and publishers post digital versions of papers so that others may have easy access to them. With the introduction of Google Scholar Beta, that availability has opened up the possibility that citations can be gathered from papers posted on the web.

The movement to a digital environment has also opened new opportunities to assess paper usage and importance. In particular, information about the extent to which a paper is downloaded can be gathered. The number of downloads, or the appearance of a paper among a set of top 25 most downloaded papers, can provide another measure of importance, usage or interest.

As a result, ease of access to citation information has increased, and new information about paper downloads can be gathered and used to evaluate the use and importance of a paper. Unfortunately, it is not clear how these different citation sources and downloads relate to each other. For example, what is the relationship between the number of citations gathered by SSCI, SCOPUS and Google Scholar, or between the number of citations from any of those sources and download information? Further, are there more or fewer citations associated with top 25 downloaded papers?

Downloads and Citations

Both download and citation information provide measures of the interest in, use of, or importance of a particular research paper. In a digital environment, downloading is a step that typically precedes citation. Researchers download a paper, examine it, and then possibly read it partially or fully. If the paper is important to their research and used to facilitate it, then the downloaded paper is cited.

Downloading takes time, effort, and potentially resources to pay for the digital downloads. As a result, downloading a paper suggests that the downloader has an “interest” in the paper. That interest may have been established for any of a number of reasons. For example, the downloader may have thought the title was interesting, or known of previous research by the particular author that was of interest. In addition, that interest may have been established because other researchers cited the paper. However, although downloading a paper may provide a measure of interest, downloading does not necessarily mean that the paper was read, used or cited in support of other research or argumentation.

On the other hand, citation analysis is more likely to indicate actual use of the research by other researchers. Citations indicate that an author or authors found the paper important in establishing precedence, an argument, a methodology or other sets of concerns that were used in their paper.

Purpose of this Paper

Accordingly, the purpose of this paper is to explore these emerging issues of digital citation and downloading of papers in the context of accounting information systems. At the most basic level, this means investigating issues such as “what is a download?” This paper also investigates some characteristics of download information, such as “recentness effects,” concerned with when papers appear on a “top 25” most downloaded paper list relative to their publication date. In addition, this paper analyzes the relationship between the number of times a paper appears on a top 25 downloaded paper list and the number of citations: are there a disproportionate number of citations to top 25 downloaded papers? Finally, this paper investigates the relationship between the numbers of citations in three citation sources for papers among the top 25 most downloaded.

Scope of this Paper

There has been limited citation analysis of research in the International Journal of Accounting Information Systems (IJAIS). Further, much of that research (e.g., Hutchinson et al. 2004) employed citation analysis that was specifically generated for the purpose of the paper, and did not draw on digital sources or widely available citation sources (e.g., SSCI). In addition, since that paper there have been a number of developments in the area of downloads and citations, such as SCOPUS and Google Scholar. As a result, the primary focus of the citation and download analysis in this paper is on IJAIS.

Outline of this Paper

This paper proceeds as follows. Section 2 summarizes some background on citations. Section 3 discusses different sources of citations, including the Social Science Citation Index and “Google Scholar,” a tool that allows determination of citations from papers posted to the Internet. Section 4 investigates downloads, compares downloads to citations, and discusses advantages and disadvantages of using download statistics. In addition, section 4 analyzes the relationship between different kinds of downloads, e.g., downloading abstracts or the complete paper. Section 5 explores the publication years of IJAIS papers appearing on “top 25” downloaded paper lists and finds a “recentness effect,” in that more recent papers generally are more likely to be downloaded. Section 6 discusses the generation of data about those papers that were listed on a top 25 most downloaded list and the number of citations to those articles. Section 7 analyzes that data, investigating the relationship between appearances on top 25 lists, citations and time. Section 8 explores tensions between Google Scholar and “top 25” download lists. Section 9 investigates some extensions. Section 10 briefly summarizes the paper.

2. Citations

This section briefly reviews the use of citations in the previous literature. Citation issues are a topic of broad-based journal interest. For example, papers focused on or investigating the topic appear in such far-ranging journals as The Journal of Bone & Joint Surgery (e.g., Kurmis 2003), Science & Public Policy (Garfield and Welljams-Dorof 1992), The International Journal of Accounting Information Systems (Hutchinson et al. 2004), the Journal of Finance (Borokhovich et al. 2000) and College and Research Libraries (Casserly and Bird 2003). Virtually every discipline is considering issues relating to citations.

Citation Information

Citations to previous literature provide an indication of the importance of that paper in the development of subsequent papers. Citations reflect building of a literature over time. As a result, citations of papers provide a measure of the importance, interest or use of the paper. Accordingly, it is important for a faculty member’s research to be cited.

Thus, citations are embedded in day-to-day faculty life. Faculty may disclose, or be required to disclose, their citations as part of yearly performance evaluation information to facilitate an understanding of what they have done over the past year. In the case of promotion and tenure, universities frequently ask other faculty members whether the candidate has had an “impact” on the field, and citations are one approach for measuring impact. Universities also periodically conduct departmental reviews, and such reviews typically include citation analyses. Citations have been used to compare the research productivity of individuals and academic programs. In addition, citations can be used to trace the history of particular streams of research (e.g., Robinson and Adler 2003). Garfield and Welljams-Dorof (1992) have suggested that citations can be used to rank not only researchers or academic programs, but even universities and countries, and they proceed to illustrate areas where such applications can be made.

Importance of Citations

Garfield and Welljams-Dorof (1992) note that citations provide quantitative data that allows indication of the relative impact of different journals, individuals, departments, institutions and ultimately nations. They also note that citations can be used to identify emerging specialties, fields, disciplines and sciences. Further, as noted by Diamond (1986), who addresses the marginal value of a citation, citations can have an economic benefit to those cited.

Smith (2004) addressed an interesting issue in trying to understand “what is a top paper, as opposed to a paper in a top journal,” focusing on publications and citations in finance. Smith examined a number of criteria that could be used to define a top paper, all based on different measures relating to the number of citations. Ultimately, Smith (2004, p. 148) suggests that “… the citation list may be a better reflection of an individual’s impact on the literature than that person’s publication list ….”

Bibliographic and Web Citations

Web citations have been proposed as a replacement for bibliographic information. Vaughan and Shaw (2003) investigated the relationship between bibliographic and web citations. Unfortunately, there are limitations associated with web-based citation availability. For example, Casserly and Bird (2003) examined 500 citations to Internet resources and found that a majority had partial bibliographic information and no date, and that few journals provided any instructions on citing such resources. Parker (2006) and others have found an increase in the use of on-line resources, but also that the availability of Internet resources decays over time. Parker further found that the rate of decay differs across disciplines, and that there is increasing concern over the impact of that decay on academic research. Nevertheless, web citations form a critical core for the emerging product, Google Scholar.

Limitations of Citations

A number of researchers, including Davis (1998), have discussed limitations of citations. For example, using citations to rank journals may not be appropriate, since different journals have different products, making such comparisons like comparing apples and oranges. Another limitation is that survey papers are likely to be cited more than other types of papers, potentially encouraging survey papers. Garfield and Welljams-Dorof (1992) also suggest that there is a long lag associated with citations. Further, citation sources typically focus on a particular set of journals or publications, potentially limiting the number of citations from particular sources, or even from subject areas represented by the excluded publication sources. Nevertheless, to date, citations remain an important quantitative measure of research importance.

3. Citation Sources

There are a number of citation sources that are widely available, including Social Science Citation Index, Google Scholar and SCOPUS.

Social Science Citation Index

The primary source of citations for citation analysis has been the Social Science Citation Index (SSCI), also known on the web as the “ISI Web of Knowledge.” SSCI captures all of the citations from a set of indexed journals, whether or not the cited papers appear in publications indexed by SSCI. As a result, it is relatively easy to determine all of the citations to a particular publication, such as IJAIS, or to a particular author. However, only articles that are cited are included in SSCI.

As discussed by Howitt (1998) and others, SSCI has a number of advantages. First, SSCI is available on line and easy to use: users can specify an author or publication and a time period, indicate that they want to find citations to that author or publication, and SSCI will provide a list of the citing articles. Second, SSCI provides substantial data about the citation and the citing reference, allowing the investigator to see who cited the work and where the citation occurred. Third, SSCI provides a quality measure of the citations, since it only indexes citations from a relatively small set of leading journals that apparently meet certain criteria. Fourth, SSCI is authoritative (e.g., Garfield and Welljams-Dorof 1992); it is the primary, most relied-upon source of citation information. There has also been substantial secondary use of SSCI information. For example, the company that produces the index uses SSCI data as a basis to analyze the research productivity of individuals or programs.

However, there are also some limitations of SSCI. It is arguable that SSCI includes too few journals. Many journals in a range of areas are not included in SSCI; even journals that lead their sub-fields are not included. For example, citations are not gathered from the Journal of Information Systems or the International Journal of Accounting Information Systems. On the other hand, others argue that SSCI includes too many journals. Some research studies have used citations derived from a subset of the journals covered by SSCI, and increasingly some web sites focus on a smaller set of journals. For example, in order to rank the “Top 100 Business Schools,” one web site () only considers 24 journals, leaving out virtually all but a few general interest journals within each business discipline.

Google Scholar

With the Internet and World Wide Web, there are constant changes in the ability to examine citations. Recently, Google broadened its product line with “Google Scholar” (). Google Scholar, currently in beta form, has a number of capabilities, including finding citations, and does not appear to be limited to particular file types. As noted on the Google Scholar site, “Google's mission is to organize the world's information and make it universally accessible and useful. Facilitating library access to scholarly texts brings us one step closer to this goal.” Any article that is on the web, or cited in a paper on the web, will show up in Google Scholar.

However, Google Scholar also has some limitations. First, it is in beta, so there may be some errors; however, these are likely to be fixed as the system comes into production. Second, there is no quality control over the citation sources, in that it gathers citations from virtually all sources on the web. Third, because it gathers citations from the web, it is limited to those sources available on the web in digital and machine-readable format; scanned documents are not captured, and older paper-based versions of papers are not included. Fourth, since Google Scholar gathers data from the web, it could potentially be manipulated by authors who place research on the web. For example, the same paper with limited differences could be placed on the web to push up the number of citations. Fifth, it is difficult to capture an entire set of citations to a particular journal: all instances of individual publications would need to be captured and then added up. Sixth, since Google Scholar is machine-based, there is no intermediary to gauge whether there are material errors in the citations.

However, there also are some advantages. First, Google Scholar is widely available and free. Second, it is easy to use; searches can be conducted for a particular paper title, which apparently is not a capability of SSCI. Third, it gathers citations from virtually all sources on the web, providing a broad-based assessment of citations and interest in a paper. Fourth, since many prepublication sources of information are on the web, it could be a signal as to which papers will ultimately be cited by sources such as SSCI. Fifth, Google Scholar is easy to use if the interest is in finding the number of citations to a particular paper.

SCOPUS

An emerging competitor to SSCI is SCOPUS. Like SSCI, SCOPUS indexes a set of publications. Unlike SSCI, however, SCOPUS includes all articles from those publications, along with the number of citations to them. As a result, in SCOPUS a publication can have zero citations but still be listed. SCOPUS apparently is owned by Elsevier and, as a result, appears to index all of the Elsevier publications, among others.

SCOPUS has virtually the same limitations and advantages as SSCI. However, SCOPUS does not index the same set of journals; unlike SSCI, it does index IJAIS.

4. Downloads

Downloads generally refer to instances when a paper is retrieved from some source on the Internet. Although citation information has been readily available for years, digital paper download information is a relatively new phenomenon. Monitoring the number of downloads can provide a measure of the relative importance of the research. Although such information has only recently become available, some faculty already are asked to disclose paper download information as part of their annual performance reviews.

Sources of Download Information

Download information must be gathered from the source that holds the papers being downloaded. Typically, this is a publisher, such as Elsevier, that keeps track of such downloads, ultimately for revenue generation purposes.

Publishers and journal editors have a number of reasons to publish lists of the most downloaded papers. First, such lists provide a measure of interest in particular papers to guide potential downloaders; as a result, such lists aim to get users to download the papers. Second, such lists provide information that authors can use to tout their own research in annual performance reviews and on their vitas. For example, authors could report that a “Paper was number 3 among the top 25 in downloads for 2003 for XYZ journal.” Seemingly, this could provide a measure of the importance of the paper to other researchers. Third, such lists can generate interest in publishing in those journals where the information is available, so that the information can be used: researchers might see that some group of researchers publishes in the journal and consider themselves part of that group.

What Are and Aren’t Journals Disclosing?

While journals are disclosing lists of the most downloaded papers, generally the only numbers that journals provide are the relative rankings for that journal; the actual numbers of downloads are seldom disclosed. In 2003 and 2004, Elsevier disclosed the total number of downloads for some of their journals; most recently, however, they just provide top 25 rankings without the total number of downloads.

Advantages and Disadvantages of Using Downloads

There are a number of advantages and disadvantages of using lists of downloaded papers as a means of assessing the research “importance” of a paper.

The number of downloads potentially provides an important measure of interest in a given paper, line of research, or author. Downloading a paper is a discretionary act; as a result, theoretically, a paper generally would be downloaded because of interest in the paper. Further, downloads do not have the time lag associated with citations, which must go through the publication process.

Unfortunately, downloads can be manipulated in a number of ways. Authors can download their own papers multiple times, potentially improving their position among the more downloaded papers. Authors can also make their paper part of a class, e.g., a Ph.D. class: rather than distributing copies, the authors could tell students to go to the journal and download the paper. With downloads, there is no information about who downloaded the paper or for what purpose.

Further, downloads do not necessarily result in new knowledge for the reader that leads to additional research. Although the paper may be downloaded, it might not be read and it might not be used to generate or substantiate further research.

Potentially, the number of downloads also could be influenced by the company that owns the journal. For example, periodically firms will make the papers in some issue of a journal “free,” and accordingly there may be a greater number of downloads of free papers. Further, automated “robot” downloads have become increasingly possible, so that although a paper is downloaded, a person may never actually see it.

What is a “Download?”

In addition, it may not be clear what even counts as a “download.” Downloads can refer to many different digital representations. At a typical site, users can download an abstract or the complete paper, and complete papers may be offered in html or in journal format. In addition, statistics gathered at the site are likely to include unsuccessful download attempts, i.e., denied accesses. Organizations contract for a certain number of accesses over a particular time period, e.g., a month; in the case of a denied access, a user may not be eligible to download a paper because the user’s institution has run out of accesses or because the organization has not contracted for the journal.

As a result, there is an interest in understanding the relationship between different kinds of accesses. For example, how correlated are abstract downloads, full paper downloads and denied accesses? In order to address this issue, I analyzed available download data for the journal International Journal of Intelligent Systems in Accounting, Finance and Management. The data covered the three years 2000, 2001 and 2002, and included the 100 most downloaded papers for each year.1 Counts for each of the three types of downloads were statistically significantly correlated with each other. The correlation coefficients are summarized in table 1.

Table 1

Relationship between Download Types

Correlation Coefficients for Download Types*

Year Full Text/Abstract Full Text/Denied Access Abstract/Denied Access

2000 .797 .757 .915

2001 .932 .914 .945

2002 .939 .903 .925

*All statistically significantly different than 0 at the .0000 level.
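As an illustration of the kind of analysis underlying Table 1, the following is a minimal sketch, not the author's actual code. The download counts shown are hypothetical placeholders; the real data were the 100 most downloaded papers per year for 2000-2002.

import numpy as np
from scipy.stats import pearsonr

# Each row is one paper: (full text downloads, abstract downloads, denied accesses).
# These values are invented solely to make the example runnable.
downloads = np.array([
    [412, 530, 96],
    [298, 401, 71],
    [155, 240, 40],
    [120, 180, 35],
    [ 90, 150, 22],
])

pairs = [("Full Text", "Abstract", 0, 1),
         ("Full Text", "Denied Access", 0, 2),
         ("Abstract", "Denied Access", 1, 2)]

for name_a, name_b, i, j in pairs:
    # Pearson correlation and its p-value for each pair of download types.
    r, p = pearsonr(downloads[:, i], downloads[:, j])
    print(f"{name_a}/{name_b}: r = {r:.3f}, p = {p:.4f}")

Run on the actual 100-paper annual data sets, this kind of pairwise computation would produce the coefficients reported in Table 1.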

5. IJAIS Downloaded Papers by Year

I took information from Elsevier regarding the top 25 downloaded papers in IJAIS for each quarter of 2005 and 2006 (2005 and 2006 were the only calendar years for which top 25 results were fully available at the time of this paper). For each of the two sets of 100 entries, I captured the year of publication of each entry and determined the total number in the top 25 associated with each year, 2000-2006.2

The resulting numbers of downloaded papers by year for 2005 and 2006 are summarized in table 2. In the table, the column “Total” refers to the total number of articles for each year, given that the same paper can appear up to four times. The column “Total/Unique” refers to the situation where each paper is counted only once. In 2005, the largest number of downloads is from the year preceding the calendar year, while for 2006, the largest number of downloads is from 2006 itself. In general, the further back in time an article was published, the fewer papers from that year are included in the top 25. Further, journal articles seem to have a “download half life,” in that the number of times a paper is downloaded generally decreases gradually over time. Although articles from more recent years are more heavily downloaded, articles from the entire history of the journal are still downloaded. As other years become available, the same process can be used to lay out and study download patterns.

Table 2

Downloaded Papers by Year for 2005 and 2006

Panel A: Downloaded Papers by Year for 2005

Volume Year Total Total/Unique

Volume 6 2005 14 9

Volume 5 2004 35 13

Volume 4 2003 18 6

Volume 3 2002 12 5

Volume 2 2001 13 5

Volume 1 2000 8 4

Total 100 42

Panel B: Downloaded Papers by Year for 2006

Volume Year Total Total/Unique

Volume 7 2006 27 13

Volume 6 2005 21 7

Volume 5 2004 20 8

Volume 4 2003 16 6

Volume 3 2002 7 4

Volume 2 2001 4 3

Volume 1 2000 5 2

Total 100 43
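The tally behind Table 2 can be sketched as follows; this is a hypothetical reconstruction rather than the author's procedure, and the sample entries are invented. Each quarterly list contributes (paper, publication year) pairs, so a paper can contribute up to four appearances in a calendar year.

from collections import Counter

# Hypothetical entries drawn from four quarterly top-25 lists for one calendar year.
quarterly_entries = [
    ("Hunton2003", 2003), ("Bradford2003", 2003), ("Nicolaou2004", 2004),
    ("Hunton2003", 2003), ("Trites2004", 2004), ("Poston2001", 2001),
]

# "Total": every appearance counts, so a paper can contribute up to four times.
total_by_year = Counter(year for _, year in quarterly_entries)

# "Total/Unique": each paper counted once, attributed to its publication year.
unique_by_year = Counter(year for _, year in set(quarterly_entries))

for year in sorted(total_by_year, reverse=True):
    print(year, total_by_year[year], unique_by_year.get(year, 0))

Applying this counting to the 100 entries for each of 2005 and 2006 yields the “Total” and “Total/Unique” columns of Table 2.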

6. IJAIS Downloads vs. Citations

This section analyzes the relationship between download information and citations.

IJAIS Citation Information

IJAIS is not an indexed journal in SSCI. However, any paper indexed by SSCI that cites an IJAIS paper will result in a citation to IJAIS. Since no accounting information systems journals are directly indexed by SSCI, the SSCI citations provide a measure of the use of IJAIS only by those “outside” of accounting information systems.

In order to generate SSCI citation information, I first examined some papers that cited IJAIS papers to find the abbreviation used by SSCI. After finding that abbreviation, I searched for “Int J Accounting Inf” to find the citations in SSCI. In Google Scholar, I searched under “International Journal of Accounting Information Systems.”

Although IJAIS is not indexed in SSCI, it is indexed in SCOPUS. As of November 2007, there were roughly 2.5 times as many citations of IJAIS in SCOPUS (269) as in SSCI (109). In contrast, it is not clear how many total citations of IJAIS can be generated using Google Scholar, because the search engine is aimed more at finding citations to a particular paper.

IJAIS Downloads

I was unable to find any data regarding the direct number of downloads for IJAIS. However, as noted above, information about which papers have been most frequently downloaded is available on the web. In particular, at the time of this paper, thirteen quarterly lists of the 25 most frequently downloaded papers were posted on the web, starting with July – September 2004 and ending with July – September 2007. As a result, I used the number of “appearances” on a top 25 download list as a measure of the extent to which a paper was downloaded.

Approach to Analyzing Citations vs. Downloads

Although I used all thirteen lists, I limited the set of papers analyzed to those published in 2005 or earlier, so that there would be time for the papers to be cited, in order to study the relationship between citations and downloads. The thirteen quarterly lists yielded 325 entries, of which 217 non-unique entries were papers from 2000 – 2005. That set of 217 entries corresponded to 41 unique papers. For each of those 41 papers, I gathered the number of appearances on top 25 lists. Then, for each of those 41 papers, I gathered the number of SSCI citations on two dates, the number of SCOPUS citations and the number of Google citations. The results are presented in table 3, ranked by number of appearances on an IJAIS top 25 list, then by SSCI citations, and then by SCOPUS citations.
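The assembly of that data set can be sketched as follows; this is a hypothetical reconstruction under the assumptions stated in the comments, not the author's code, and the paper identifiers and citation counts are illustrative only.

from collections import Counter

# Thirteen lists of (paper_id, publication_year); 25 entries each in the real data.
# Only two abbreviated lists are shown here for illustration.
quarterly_lists = [
    [("PostonGrabski2001", 2001), ("Hunton2003", 2003)],
    [("Hunton2003", 2003), ("ODonnell2000", 2000)],
]

# Count appearances, keeping only papers published in 2005 or earlier so that
# there is time for citations to accumulate.
appearances = Counter(
    paper for quarter in quarterly_lists
          for paper, year in quarter
          if year <= 2005
)

# Citation counts gathered by hand from SSCI, SCOPUS and Google Scholar (hypothetical values).
citations = {"Hunton2003": {"ssci": 5, "scopus": 21, "google": 53}}

for paper, n_lists in appearances.most_common():
    cites = citations.get(paper, {})
    print(paper, n_lists, cites.get("ssci", 0), cites.get("scopus", 0), cites.get("google", 0))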

Initial Findings

Analysis of the list in Table 3 provides some initial insights. Poston and Grabski (2001) was the most cited paper in each of the four different citation counts. Three papers, Hunton et al. (2003), Bradford and Florin (2003) and O’Donnell (2000), were tied for the most appearances on a download list, at 13. The median number of download list appearances was 4, and the mean was 5.29. The mean numbers of citations for SSCI (November and February), SCOPUS and Google were, respectively, 1.78, 1.53, 5.61 and 13.32. Google likely has more citations since it is not limited to published research and encompasses virtually the entire web. SCOPUS likely has more than SSCI since SCOPUS includes IJAIS in its set of indexed journals.

Table 3

Citations and Appearances on Top 25 Download Lists

|Authors |SSCI-11/2007 |SSCI-2/2007 |Scopus-11/2007 |Google-11/2007 |Number of Top 25 Download Lists |

|Hunton et al. 2003 |5 |5 |21 |53 |13 |

|Bradford & Florin 2003 |5 |5 |16 |52 |13 |

|O'Donnell 2000 |2 |2 |6 |20 |13 |

|Malone 2002 |4 |4 |8 |24 |12 |

|Debreceny & Gray 2001 |1 |1 |11 |27 |11 |

|Bhattacharya et al. 2003 |0 |0 |2 |12 |11 |

|Poston & Grabski 2001 |12 |12 |39 |84 |10 |

|Mauldin & Richtermeyer 2004 |1 |1 |2 |1 |10 |

|Trites 2004 |0 |0 |3 |15 |10 |

|Murthy 2004 |3 |2 |4 |11 |9 |

|Hutchinson et al. 2004 |0 |0 |0 |1 |9 |

|Nicolaou 2004 |2 |2 |10 |17 |8 |

|Kang & Bradley 2002 |1 |0 |5 |9 |8 |

|Rohde 2004 |1 |0 |3 |6 |8 |

|Calderon & Cheh 2002 |9 |6 |13 |14 |6 |

|Kaplan 2003 |3 |3 |7 |12 |6 |

|Ettredge et al. 2001 |4 |4 |19 |36 |5 |

|Marston & Polei 2004 |2 |1 |7 |10 |5 |

|Sutton & Hampton 2003 |0 |0 |2 |2 |5 |

|Nicolaou 2000 |0 |0 |2 |11 |4 |

|Alles et al. 2004 |0 |0 |1 |6 |4 |

|Hunton & Harmon 2004 |0 |0 |0 |1 |4 |

|Liang et al. 2001 |1 |1 |2 |2 |3 |

|Dehning et al. 2004 |1 |1 |1 |1 |3 |

|Dull & Tegarden 2004 |1 |1 |1 |1 |3 |

|O'Leary 2004 |0 |0 |3 |13 |3 |

|Dilnutt 2002 |0 |0 |2 |5 |3 |

|O'Leary 2002 |6 |6 |11 |21 |2 |

|Bedard et al. 2003 |3 |1 |4 |9 |2 |

|Sutton 2000 |0 |0 |6 |19 |2 |

|Greenstein & McKee 2004 |0 |0 |1 |5 |2 |

|Nicolaou 2002 |3 |3 |6 |12 |1 |

|Woodroof & Searcy 2001 |2 |1 |6 |8 |1 |

|Dull et al. 2003 |1 |1 |2 |3 |1 |

|Poston & Grabski 2000 |0 |0 |2 |7 |1 |

|O'Leary 2003 |0 |0 |1 |4 |1 |

|Lynch & Gomaa 2003 |0 |0 |1 |3 |1 |

|Vasarhelyi & Greenstein 2003 |0 |0 |0 |8 |1 |

|Rose, Roberts, Rose 2004 |0 |0 |0 |1 |1 |

|Bowen, Rohde & Wu 2004 |0 |0 |0 |0 |1 |

|Harmon 2004 |0 |0 |0 |0 |1 |

|Totals |73 |63 |230 |546 |217 |

|Average |1.78 |1.53 |5.61 |13.32 |5.29 |

7. Findings: Relationships between Citations and Downloads

Using the data generated in table 3, a number of different relationships were investigated. The correlations between each of the citation sources and the number of top 25 download appearances are given in table 4, where each of the correlations in the table is statistically significantly different from zero at better than the .001 level.

Table 4

Correlations Between Different Sources

| |Citations SSCI-11/2007 |Citations SSCI-2/2007 |Citations Scopus-11/2007 |Citations Google-11/2007 |Number of Top 25 Download Lists |
|Citations SSCI-11/2007 | |0.97167 |0.875105 |0.766919 |0.35835 |
|Citations SSCI-2/2007 | | |0.911255 |0.835387 |0.381872 |
|Citations Scopus-11/2007 | | | |0.94929 |0.461651 |
|Citations Google-11/2007 | | | | |0.54984 |
|Number of Top 25 Download Lists | | | | | |
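The correlations in Table 4 are pairwise Pearson coefficients over the columns of Table 3. A minimal sketch of that computation follows; it is not the author's code, and only a few illustrative rows of Table 3 are included to keep the example short.

import pandas as pd
from scipy.stats import pearsonr

# A few rows from Table 3 (Hunton et al., Bradford & Florin, Poston & Grabski,
# Murthy, Calderon & Cheh); the full analysis uses all 41 papers.
df = pd.DataFrame({
    "ssci_nov07":   [5, 5, 12, 3, 9],
    "ssci_feb07":   [5, 5, 12, 2, 6],
    "scopus_nov07": [21, 16, 39, 4, 13],
    "google_nov07": [53, 52, 84, 11, 14],
    "top25_lists":  [13, 13, 10, 9, 6],
})

# Full Pearson correlation matrix, analogous to Table 4.
print(df.corr(method="pearson"))

# Significance of each pairwise correlation.
cols = list(df.columns)
for i, a in enumerate(cols):
    for b in cols[i + 1:]:
        r, p = pearsonr(df[a], df[b])
        print(f"{a} vs {b}: r = {r:.3f}, p = {p:.4f}")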

Relationship Between Citations for Top 25 Download Sample

As seen in table 4, each of the four different citation counts is statistically significantly correlated with the others. Perhaps not surprisingly, the SSCI citations from February and November of 2007 are the most highly correlated for our sample of top 25 downloads, since they come from the same source. Since SSCI does not index IJAIS, it is probably not surprising that both samples of SSCI citations are not as correlated with SCOPUS or Google citations, since both of those latter sources do index IJAIS articles.

Relationship Between Number of Citations and Top 25 Appearances

The number of citations was statistically significantly and positively related to the number of appearances on download lists for each of SSCI, SCOPUS and Google, with the highest correlation occurring for Google and the lowest for the most recent SSCI.

The correlation between the number of SCOPUS citations and top 25 download appearances was substantially higher than the correlation between the number of SSCI citations and the top 25 download appearances. This suggests that the number of appearances on a top 25 download list is more closely related to a citation source that indexes the journal than to one that does not.

Further, the correlation between Google citations and the number of download-list appearances is the highest. Since Google citations are not necessarily from published sources, Google may be a leading indicator of citations in published papers.

In addition, although the difference is not statistically significant, the correlation between number of appearances on a top 25 list and SSCI citations decreases over time, from February 2007 to November 2007. As additional information about top 25 lists becomes available, that relationship can be studied in more detail. Future research also could examine the relationship between the correlation between the number of citations in other sources, such as Google and SCOPUS, over time.

Top 25 Downloads and Citations vs. Time

We generally would expect the number of citations to be negatively related to publication year, or equivalently positively related to the length of time that the paper has been available to cite, since citations accumulate over time. However, in the case of the number of appearances on a top 25 download list, it is not clear that time would be a significant variable, particularly since we have noted a recency effect. The correlations between year of publication and the different citation sources and the number of top 25 download appearances are given in table 5. The correlation coefficients for each of the citation sources are significantly different from zero at the .01 level or better; however, the correlation between year and the number of top 25 download appearances was not statistically significantly different from zero.

Table 5

Correlation between Citations and Year of Publication for Top 25 Articles

|Source |Correlation |

|Citations SSCI-11/2007 |-0.25689 |

|Citations SSCI-2/2007 |-0.27333 |

|Citations Scopus 2/2007 |-0.35272 |

|Citations Google 11/2007 |-0.35067 |

|Number of Top 25 Download Lists |-0.03366 |

Top 25 Downloads As Compared to the Rest of the Articles

The sample of papers appearing on a top 25 download list appears to have different citation properties than those that do not appear on such lists. Some of those differences are summarized in Table 6. The proportion of papers that appeared on a top 25 list was .3831776. That means that roughly 38% of the papers accounted for roughly 67% (74%) of the citations in SSCI in November (February) 2007. Further, that same 38% of the papers accounted for over 85% of the citations in SCOPUS for the sample period through 2005. I used a test of proportions (e.g., Dixon and Massey 1969) to find that the proportion of .3831776 was statistically significantly different from each of the citation proportions at better than the .01 level. (There are no comparative results for Google since it is difficult to obtain a single-number estimate of Google citations for an entire publication such as IJAIS.) These results suggest that appearance on a top 25 download list is associated with a disproportionate share of citations.

Table 6

Top 25 Download Lists vs. Total

| |Citations SSCI-11/2007 |Citations SSCI-2/2007 |Citations Scopus-11/2007 |Citations Google-11/2007 |Number of Top 25 Download Lists |Number of Articles |
|Total for Top 25 Download Papers |73 |63 |230 |546 |217 |41 |
|Total |109 |85 |269 | |217 |107 |
|Fraction of Total |0.669725 |0.741176 |0.855019 | |1 |0.3831776 |
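The comparison of proportions in Table 6 can be illustrated with a standard two-proportion z-test, in the spirit of the test of proportions cited above (e.g., Dixon and Massey 1969). This is a minimal sketch rather than necessarily the author's exact test; the figures are taken directly from Table 6.

import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Articles: 41 of 107 IJAIS papers appeared on a top 25 list.
# SSCI (Nov. 2007): 73 of 109 citations went to those 41 articles.
print("SSCI 11/2007 z =", round(two_proportion_z(73, 109, 41, 107), 2))
# SCOPUS (Nov. 2007): 230 of 269 citations went to those 41 articles.
print("SCOPUS 11/2007 z =", round(two_proportion_z(230, 269, 41, 107), 2))

Both z statistics are well above the critical value for the .01 level, consistent with the conclusion that the citation proportions exceed the article proportion.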

8. Tensions between Google Scholar Citations and Downloads Lists

There are potential tensions between the number of citations and the number of downloads for any given paper. For example, for Google Scholar to capture citations, papers must be posted to the web, whether on a proprietary site (e.g., Elsevier) or not (e.g., an open posting of a paper). Journals typically create a PDF file for papers that are published, and many faculty post those papers to their home pages, making them available to others and to Google Scholar. It is that kind of environment that allows Google Scholar to generate citations on demand.

However, while Google Scholar needs citable papers to be on the web, an author interested in having a paper appear on a top 25 download list is better off NOT having the paper openly available on the web. Otherwise, users can find the paper on the web and will not need to download it from the publisher. Accordingly, publishing “top twenty-five” download lists provides at least some incentive for authors not to post papers to the web.

9. Extensions3

This paper could be extended in a number of directions. First, researchers could be surveyed regarding some of the issues discussed. For example, the advantages and disadvantages listed for using downloads as a measure could be used to establish the perceived importance of each item, and open-ended questions could be used to gather other reasons. Second, a broader base of journals could be examined for each of the issues examined here, and that base could include years beyond those examined here; however, the examination in this paper is limited to the data discussed here. Third, Smith (2004) studied the relationship between citations and “top articles.” That research could be extended by defining top papers through other criteria, such as downloads, or through other sources of citations, such as Google Scholar. Fourth, actual download data, if it were available, could be compared against citation data. Although such data is not necessarily available externally, it is surely available internally, since top 25 lists are based on actual downloads. Fifth, given that this research finds a correlation between downloads and citations, future research could focus on using download information to predict the number of citations; different download variables, e.g., the number of appearances on, or length of time on, top 25 lists, could be used as part of such models. Sixth, behavioral or survey research could be done to find the extent to which downloaded or cited papers are actually read.

10. Summary

“Digital” versions of research papers are now widely available. Individuals can post their research papers to the Internet for general access, and journals provide easily accessible digital versions of their papers on the Internet. These developments have made it possible to generate new information that can facilitate the evaluation of research. For example, SSCI citation information is now available in a digital format, rather than just in hard copy. Competitors, such as SCOPUS, have generated their own digital citation information, and Google Scholar is a new service that captures citation information available on the web. Further, some digitally available journals now post “top 25” download lists.

This research investigated those developments and found

▪ Downloads of three different forms (abstract, paper and denied access) are highly correlated.

▪ Top 25 download lists reflect “recentness” effects, and the number of “older” papers on the lists gradually declines.

▪ For the sample of top 25 downloads, the numbers of citations from different citation sources are highly correlated.

▪ For the sample of top 25 downloads, the number of citations from each source and the number of appearances on top 25 download lists are highly correlated.

▪ For the sample of top 25 lists, the number of citations is related to time, whereas the number of appearances on top 25 download lists is not.

▪ For the sample of top 25 lists, the proportion of citations accounted for by those papers is statistically significantly higher than the proportion of articles they represent, for both SSCI and SCOPUS citations.

Footnotes

1. Data availability limited the search to those years.

2. ().

3. The author would like to thank the referees for some of the extensions provided in this section.

References

Alles, M. G., Kogan, A., Vasarhelyi, M., “Restoring Auditor Credibility: Tertiary Monitoring and Logging of Continuous Assurance Systems,” International Journal of Accounting Information Systems, Volume 5, Number 2, 2004, pp. 183-202.

Bedard, J.C., Jackson, C., Ettredge, M., and Johnstone, K., “The Effect of Training on Auditors’ Acceptance of an Electronic Work System,” International Journal of Accounting Information Systems, Volume 4, Number 4, 2003, pp. 227-250.

Bhattacharya, S., Behara, R., and Gundersen, D., “Business Risk Perspectives on Information Systems Outsourcing,” International Journal of Accounting Information Systems, Volume 4, Number 1, March 2003, pp. 75-95.

Borokhovich, K., Bricker, R., and Simkins, B., The Journal of Finance, June 2000, Volume 55, Number 3, pp. 1457-1470, 2000.

Bowen, P., Rohde, F., Wu, C., “Imperfect Communication Between Information Requesters and Information Providers,” International Journal of Accounting Information Systems, Volume 5, Number 4, pp. 371-394, 2004.

Bradford, M. and Florin, J., “Examining the Role of Innovation Diffusion Factors on the Implementation Success of Enterprise Resource Planning Systems,” International Journal of Accounting Information Systems, Volume 4, Number 3, pp. 205-225, 2003.

Calderon T.G. and Cheh J.J., “A roadmap for future neural networks research in auditing and risk assessment,” International Journal of Accounting Information Systems, Volume 3, Number 4, pp. 203-236, December 2002.

Casserly, M. and Bird, J., “Web Citation Availability: Analysis and Implications for Scholarship,” College & Research Libraries, Volume 64, Issue 4 (July): pp. 300-17, 2003. Available at ala/acrl/acrlpubs/crljournal/backissues2003b/julymonth/casserly.pdf

Davis, J. “Problems in using the Social Sciences Citation Index to Rank Economics Journals.” American Economist, Fall 1998, pp. 59 – 64.

Debreceny, R. and Gray, G.L., “The production and use of semantically rich accounting reports on the Internet: XML and XBRL,” International Journal of Accounting Information Systems, Volume 2; Number 1, pages 47-74, 2001.

Dehning, B., Dow, K., and Stratopoulos, T., “Information Technology and Organizational Slack,” International Journal of Accounting Information Systems, Volume 5, Number 1, pp. 51-63, 2004.

Diamond, A., “What is a Citation Worth?” Journal of Human Resources, Volume 21, pp. 200-215, 1986.

Dilnutt, R., “Knowledge Management in Practice – Three Contemporary Case Studies,” International Journal of Accounting Information Systems, Volume 3, Number 2, pp. 75-81, 2002.

Dixon, W. and Massey, F., Introduction to Statistical Analysis, McGraw-Hill, New York, 1969.

Dull, R., Graham, A., and Baldwin, A., “Web-Based Financial Statements: Hypertext Links to Footnotes and their Effect on Decisions,” International Journal of Accounting Information Systems, Volume 4, Number 3, pp. 185-203, 2003.

Dull, R. and D. Tegarden, “Using Control Charts to Monitor Financial Reporting,” International Journal of Accounting Information Systems, Volume 5, Issue 2, 1 July 2004, Pages 109-127.

Ettredge, M., Richardson, V.J., Scholtz, S., “The Presentation of Financial Information at Corporate Web Sites,” International Journal of Accounting Information Systems, Volume 2, Number 3, pp. 149-168, September 2001.

Greenstein, M., and McKee, T., “Assurance Practitioners’ and Educators’ Self Perceived IT Knowledge Level,” International Journal of Accounting Information Systems, Volume 5, Number 2, pp. 213-243, 2004.

Harmon, W., “Fourth International Research Symposium on Accounting Information Systems,” International Journal of Accounting Information Systems, Volume 5, Number 4, pp. 369-370, 2004.

Howitt, M., “ISI Spins a Web of Science,” Database, April/May, 1998, pp. 37-40.

Hunton, J. E. and Harmon, W., “A Model for Investigating Telework in Accounting,” International Journal of Accounting Information Systems, Volume 5, Number 4, pp. 417-427, 2004.

Hunton, J.E., Lippincott, B. and Reck, J., “Enterprise Resource Planning Systems: Comparing Performance of Adopters and Non-Adopters,” International Journal of Accounting Information Systems Volume 4, Number 3, 165-184, 2003.

Hutchinson, P., White, C., Daigle, R., “Advances in Accounting Information Systems and International Journal of Accounting Information Systems,” International Journal of Accounting Information Systems, Volume 5, Number 3, pp. 341-365, 2004.

Garfield, E. and Welljams-Dorof, A., “Citation Data: Their Use as Quantitative Indicators for Science and Technology,” Science & Public Policy, Volume 15, Number 5, pp. 321-327, October 1992.

Kang, H. and Bradley, G., “Measuring the Performance of IT Services- An Assessment of SERVQUAL,” International Journal of Accounting Information Systems, Volume 3, Number 3, October 2002.

Kaplan, S. and Nieschwietz, R., “A Web Service Model of Trust for B2C e-Commerce,” International Journal of Accounting Information Systems, Volume 4, Number 2, pp. 95-114, 2003.

Kurmis, A., “Understanding the Limitations of the Journal Impact Factor,” The Journal of Bone & Joint Surgery, Volume 85, pp. 2449-2454, 2003.

Liang, D., Lin, F., Wu, S., “Electronically Auditing EDP Systems: With the Support of Emerging Technologies,” International Journal of Accounting Information Systems, Volume 2, Number 2, pp. 130-147, 2001.

Lynch, A. and Gomaa, M., “Understanding the Potential Impact of Information Technology on the Susceptibility of Organizations to Fraudulent Employee Behavior,” International Journal of Accounting Information Systems, Volume 4, Number 4, pp. 295-308, 2003.

Malone, D., “Knowledge Management: A model for organizational learning,” International Journal of Accounting Information Systems, Volume 3, Number 2, pp. 111–123, 2002.

Marston, C., and Polei, A., “Corporate Reporting on the Internet by German Companies,” International Journal of Accounting Information Systems, Volume 5, Number 3, pp. 285-311, 2004.

Mauldin, E. and Richtermeyer, S., “An Analysis of ERP Annual Report Disclosures,” International Journal of Accounting Information Systems, Volume 5, Number 4, pp. 395-416, 2004.

Murthy, U. and Groomer, S., “A Continuous Auditing Web Services Model for XML-based Accounting Systems,” International Journal of Accounting Information Systems, Volume 5, Number 2, pp. 139-163, 2004.

Nicolaou, A., “A Contingency Model of Perceived Effectiveness in Accounting Information Systems,” International Journal of Accounting Information Systems, Volume 1, Number 2, 2000, pp. 91-105.

Nicolaou, A., “Adoption of Just-in-Time Electronic Data Interchange,” International Journal of Accounting Information Systems, Volume 3, Number 1, pp. 35-62, 2002.

Nicolaou, A. I., “Quality of Post Implementation Review for Enterprise Resource Planning Systems,” International Journal of Accounting Information Systems, Volume 5, Number 1, Pages 25-49, May 2004.

O’Donnell, E. and David, J.S., “How Information Systems Influence User Decisions,” International Journal of Accounting Information Systems, Volume 1, Number 3, pp. 178-203, December 2000.

O’Leary, D.E. “Knowledge Management Across the Enterprise Resource Planning Life Cycle,” International Journal of Accounting Information Systems, Volume 3, Number 2, pp. 99-110, 2002.

O’Leary, D.E., “Auditor Environmental Assessment,” International Journal of Accounting Information Systems, Volume 4, Number 4, pp. 275-294, 2003.

O’Leary, D.E., “On the Relationship between REA and SAP,” International Journal of Accounting Information Systems, Volume 5, Number 1, pp. 65-81, 2004.

Parker, A., “Link Rot or How the Inaccessibility of Electronic Citations Affects the Quality of New Zealand Scholarly Literature,” 2006.

Poston, R. and Grabski, S., “Accounting Information Systems Research: Is it another QWERTY?,” International Journal of Accounting Information Systems, Volume 1, Number 1, pp. 9-53, 2000.

Poston, R. and Grabski, S., “Financial Impacts of Enterprise Resource Planning Implementations,” International Journal of Accounting Information Systems, Volume 2, Number 3, pp. 271-294, 2001.

Robinson, L. and Adler, R., “Business Research in Eight Business Disciplines,” Working Paper presented at the International Business and Economics Research Conference, Las Vegas, October 6, 2003. Available at

Rohde, F., “IS/IT Outsourcing Practices of Small- and Medium-sized Manufacturers,” International Journal of Accounting Information Systems, Volume 5, Number 4, pp. 429-451, 2004.

Rose, J., Roberts, F., and Rose, A., “Affective Responses to Financial Data and Multimedia,” The International Journal of Accounting Information Systems, Volume 5, Number 1, pp. 5-24, 2004.

Smith, S., “Is an Article in a Top Journal a Top Article?,” Financial Management, Winter 2004, 33, number 4, pp. 133-149.

Sutton, S.G., “The changing face of accounting in an information technology dominated world,” International Journal of Accounting Information Systems, Volume 1, Number 1, pp. 1-8, 2000.

Sutton, S.G. and Hampton, C., “Risk Assessment in an Extended Enterprise Environment,” International Journal of Accounting Information Systems, Volume 4, Number 1, pp. 57-73, 2003.

Trites, G., “Director Responsibility for IT Governance,” International Journal of Accounting Information Systems, Volume 5, Number 2, pp. 89-99, 2004.

Vasarhelyi, M. and Greenstein, M., “Underlying Principles of the Electronization of Business,” International Journal of Accounting Information Systems, Volume 4, Number 1, pp. 1-25, 2003.

Vaughan, L. and Shaw, D., “Bibliographic and Web Citations: What is the Difference?,” Journal of the American Society for Information Science and Technology, December 2003, Volume 54, Number 14, pp. 1313-1322.

Woodroof, J., and Searcy, D., “Continuous Audit: Model Development and Implementation within a Debt Covenant Compliance Domain,” International Journal of Accounting Information Systems, Volume 2, Number 3, pp. 169-191, 2001.
