EBSCO and Summon Discovery Generator Tools: How Accurate Are They?

Ken Laing – Instructional Services and Digital Initiatives Librarian, klaing@selkirk.ca, 301 Frank Beinder Way, Castlegar, BC V1N 4L3
Kathleen James – Librarian, kjames@selkirk.ca, 301 Frank Beinder Way, Castlegar, BC V1N 4L3

Abstract

Most college and university libraries in British Columbia use either EBSCO or Summon Discovery search services as a “one stop shop” for users to find, save, and cite resources for their projects. The citation generators that both EBSCO and Summon offer simplify the process of inserting citations into essays; however, the citations must be accurate for users to submit error-free work. With this in mind, this article examines the accuracy of the citation generators each service offers. Twenty print books, twenty eBooks, and twenty articles were selected from student library assignments, and citations were generated in APA, Chicago, and MLA styles to determine the number and types of errors that occur in EBSCO and Summon Discovery. The error rate was substantial enough that caution must be used when creating citations with these tools, and librarians should advise users that Discovery citation generators cannot be relied upon to provide accurate citations.

Introduction

According to Clarke and Oppenheim (2006), 80% of undergraduate student papers contain bibliographies free of error, which at first glance seems impressive. However, that leaves 20% of students submitting assignments with bibliographies that contain errors of varying degrees. Each month, approximately 20% of all questions that come through AskAway, British Columbia’s online post-secondary reference service, involve citation questions (BCELN 2021).
At XX College, approximately 60% of in-person reference transactions involve citation questions, which is not surprising given that students potentially work with up to four different citation styles simultaneously, depending on the courses they are taking (XX Library 2019). While it is imperative that students have the knowledge to create citations manually, the ability to generate citations automatically through library search services such as EBSCO Discovery and Summon Discovery, the two most common search services in postsecondary institutions in British Columbia, is undoubtedly welcomed by students at all levels of post-secondary education. However, the citations that search services create need to be correct for students to cite their work properly in their assignments. The one-stop-shop nature of search services is indeed enticing: students can search for resources across a broad spectrum of databases, save and download articles, and then use the provided citation tools to cite their work. Yet just how accurate are the citations that these search services provide? With this question in mind, this article examines the citations generated by both Summon Discovery and EBSCO Discovery citation tools for print books, eBooks, and articles to determine how common it is for these services to generate citations that contain errors and, if so, the types of errors that are made.

Literature Review

Web-based citation generators and reference managers, both open source and proprietary, have been studied in the past, but citation generators offered within library search services have seen less analysis. In 2005, Kessler and Van Ullen studied the citation generators NoodleBib, EasyBib, and EndNote and concluded that all three products made errors in their citations. Of the total errors, NoodleBib made the fewest, EndNote the second most, and EasyBib the most.
Van Ullen and Kessler (2012), in a follow-up to their 2005 study, examined 12 individual databases and the accuracy of the citation tools each one offered and determined that the error rate among the products was exceptionally high. Zhang (2012) reviewed four of the main web-based citation generators (EndNote, Zotero, Connotea, and Mendeley) and analyzed the access, collection, organization, collaboration, and citation/formatting features of each. In terms of citations, the mechanics and ease of generating them were reviewed across each platform, but error type and rate were not investigated. Chang (2013) studied the web-based, open-source citation generators EasyBib, NoodleBib, BibMe, KnightCite, Citation Machine, Citation Builder, and SourceAid and analyzed the number of errors each generator made in MLA, APA, and Chicago styles. A variety of resources, both print and electronic, were input into each generator to determine overall citation accuracy, which revealed EasyBib and NoodleBib to have a high accuracy rate. Homol’s (2014) study investigated the citation accuracy of RefWorks, EndNote Basic, Zotero, and EDS when used to generate citations for electronic journals in MLA and APA. Overall, not one product was able to generate an error-free citation, and no product stood out as substantially better than another.

Methodology

To examine the accuracy of the citations generated by EBSCO Discovery Service and Summon, citations were generated for 60 records from both platforms in APA, Chicago, and MLA styles between November and December 2021. The sources used for this project included 20 print books, 20 eBooks, and 20 journal articles available through both platforms.
These sources were chosen from student library research assignments which required students to create a research question and locate appropriate sources to use in their final paper. Each source was available through both EBSCO and Summon, and citations were generated in APA, Chicago, and MLA using each platform’s citation generator. The citations were entered into a spreadsheet and checked for accuracy against the APA Manual, 7th edition, the Chicago Manual of Style, 17th edition, and the MLA Handbook, 9th edition, with all errors recorded. The criteria for checking the accuracy of references for print books were punctuation, author, date, place of publication, publisher, book title, italics, edition, and additional information. For eBooks, the criteria were punctuation, author, date, eBook title, publisher, place of publication, edition, italics, additional information, DOI/URL, and database information. The criteria for journal articles were punctuation, author, date, article title, journal title, volume/issue, page range, DOI/URL, italics, database/collection, and additional information. All errors that occurred in a citation were counted and marked in the spreadsheet. If more than one error from a category was found in a citation, all were counted towards the total number of errors. For example, if two words in a book title were not capitalized per the citation style guidelines, this counted as two errors. All errors can be found in figures 3 through 8 below.

Discovery Results

Books APA

Of the 20 print books for which APA citations were generated in Discovery, six were entirely correct. Of the 14 incorrect citations, 12 had one error and two had multiple errors: one had three errors and one had two. Every capitalization error, 13 in total, was the non-capitalization of a subtitle. One book had two subtitles and therefore two subtitle capitalization errors.
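The tallying rule described above (every error counts toward the total, even repeats within one category for one citation) can be sketched in a few lines of Python. The categories and recorded errors below are hypothetical illustrations, not data from the study:

```python
# Minimal sketch of the error-tallying scheme: each recorded error is a
# (citation_id, category) pair, and multiple errors in the same category
# for one citation are all counted (e.g. two uncapitalized title words
# count as two capitalization errors).
from collections import Counter

# Hypothetical recorded errors for three citations (illustrative only).
errors = [
    (1, "capitalization"), (1, "capitalization"),  # two uncapitalized words in one title
    (2, "edition"),
    (3, "punctuation"),
]

totals_by_category = Counter(category for _, category in errors)
total_errors = len(errors)
incorrect_citations = len({cid for cid, _ in errors})

print(dict(totals_by_category))  # {'capitalization': 2, 'edition': 1, 'punctuation': 1}
print(total_errors)              # 4
print(incorrect_citations)       # 3
```

This mirrors how the spreadsheet totals in figures 3 through 8 were derived: per-category counts, an overall error total, and a count of citations containing at least one error.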
The next most common citation error was edition statements, with 3 errors in total. One book had an edition statement for a first edition and had the edition in square brackets. The second book had an edition statement written as a word rather than a numeral. The final error type was punctuation, which occurred once: double dashes were inserted between the title and publisher.

Books Chicago

All 20 print book citations contained errors in Discovery, and three books had two errors. Place of publication was missing from all the citations, two citations had missing edition statements, and one had an edition statement for a first edition.

Books MLA

Citation errors for print books in MLA were far less common: 15 out of 20 were correct. One citation had an edition statement in square brackets with a dash between it and the publisher statement. Another citation had “revised and expanded” fully written out when “rev. and exp. ed.” is prescribed in MLA. There was also a period inserted between the edition and publisher. The final citation error was an edition statement written as a word rather than a number.

Articles APA

Of 20 APA article citations in Discovery, only two were correct. The remaining 18 citations contained a total of 22 errors: two citations had two errors, while one citation contained three. Fourteen article citations contained titles that had every word capitalized. The next most common error, found in 4 citations, was a missing DOI (Digital Object Identifier) or URL. A total of three article citations had author errors: two had the author names in all capital letters, while one three-author citation listed the first author’s name followed by “et al.” for the second and third. Finally, one citation had a missing page range.

Articles Chicago

A total of 14 article citations in Chicago were correct, which left six citations containing seven errors. There were three author errors, all having the author names in all capital letters.
Two citations did not have dates in brackets, while two were missing the page range.

Articles MLA

Fifteen article citations in Discovery were correct, a slight improvement over Chicago and a substantial improvement over APA. The five incorrect MLA article citations contained five errors: three had author names in capital letters and two had missing page ranges.

eBooks APA

In Discovery, 10 eBook citations contained 20 errors in total. Author errors were the most common, 10 in total; in all cases, author names were not inverted to last, first. Title errors totaled nine, and in all cases every word was capitalized rather than just the first. Finally, one citation had an edition statement written as a word rather than a number.

eBooks Chicago

Correct eBook Chicago citations in Discovery totaled 10, while the 10 incorrect citations contained 11 errors in total. There were 10 author errors with the names not inverted to last, first, and one title was in all capital letters.

eBooks MLA

As with print book and article citations, Discovery made the fewest eBook citation errors in MLA. In total, 10 eBooks contained 10 author errors. Again, as with eBook citations in APA and Chicago, the authors’ names were not inverted to last, first.

Summon Results

Books APA

Of the 20 citations generated for print books from Summon using APA, 9 were correct. The most common error was capitalization, with 12 errors; all occurred with proper nouns in the title. The second most common issue was author errors, with 5 in total: Summon consistently added additional author information such as library collection and publisher information. Two citations had incorrect edition statements, resulting in 2 errors. Finally, the place of publication was added to one citation and another citation had incorrect punctuation, which resulted in 2 errors.
There were 21 errors in total for the Summon print book citations.

Books Chicago

Citations generated by Summon using Chicago style had 9 out of 20 correct, with a total of 17 errors. Place of publication and author were the two categories with the most errors, at 5 each: Summon added additional places of publication to three citations, for a total of 5 errors, and added additional author information to five citations. Summon also produced incorrect punctuation, which resulted in 4 errors across four citations, and made two edition statement mistakes, resulting in 2 errors. Finally, additional information was added to one citation, which resulted in 1 error.

Books MLA

Summon was not able to generate any correct MLA citations for print books. The most common error was place of publication being added to the citations, which is not required in the 9th edition of the MLA Handbook; in two cases, multiple places of publication were added, and each was counted as an error. The author category had 4 errors, with library collection and additional groups being added. A punctuation error was present in one citation, and another citation had additional information added, for a total of 29 errors.

Articles APA

Summon produced 10 of 20 correct APA article citations, with a total of 20 errors. Capitalization was the largest error category, with 13 errors occurring in the journal title; all involved proper nouns not being capitalized. In two instances, citations had multiple words not capitalized, so each was counted as an error. The issue/volume category had 3 errors from two citations, and punctuation, author, date, and page range had 1 error each.

Articles Chicago

Summon was not able to generate any correct Chicago article citations. In all twenty citations, the DOI/URL was missing, which resulted in 20 errors. Volume/issue number had 3 errors, and 2 capitalization errors occurred in two citations.
There was also 1 author error and 1 date error, resulting in a total of 27 errors for the citations generated in Chicago style by Summon.

Articles MLA

Summon was not able to generate any correct MLA article citations. There were 47 errors in total, with 20 coming from a missing DOI or URL. Another 20 errors occurred because Summon did not include the database information for any of the articles. Two articles were missing volume/issue number information that was available in the article record; this resulted in 3 errors. Page range, date, author, and title capitalization resulted in 1 error each.

eBooks APA

Summon generated only 2 correct APA eBook citations out of 20, with a total of 31 errors. Eighteen errors were found in the author category, with the issue arising from additional authors being added. The second most prevalent error category was title capitalization: there were 9 errors found in six citations, all from Summon not capitalizing proper nouns. There were 2 errors in the edition statement category and 1 error each in punctuation and date.

eBooks Chicago

Summon generated only one correct Chicago eBook citation, and the twenty citations had a total of 53 errors. The most errors, at 20, were found in the author category, as Summon consistently added the eBook library collection to this part of the citation. Fifteen citations were missing a DOI, URL, or the database in which the eBook was found, as required by the Chicago Manual of Style, 17th edition, resulting in 15 errors. Mistakes in place of publication resulted in 10 errors, and mistakes in punctuation and added information gave 3 errors each. There were 2 errors in edition statements and 1 error in a publisher statement.

eBooks MLA

Summon was not able to generate any correct eBook citations in MLA format. The category with the most errors was the database section, as Summon did not add this to any of the citations, resulting in 20 errors.
Place of publication had 18 errors, and DOI/URL and author/editor each had 15 errors. Punctuation in the citations resulted in 5 errors, and Summon adding additional information to the citations gave 4 errors. Finally, the publisher category had 2 errors and there was 1 error with a citation date. There were 80 errors in total for the MLA eBook citations generated by Summon.

Discussion

Of the citations produced using the citation generators from EBSCO Discovery Service and Summon, Discovery produced the more accurate citations, with 86 out of 180 correct across print books, eBooks, and journal articles (see fig. 1). Summon produced only 31 out of 180 correct citations (see fig. 2). In Discovery, MLA style had the most correct citations with 42 out of 60, while in Summon MLA performed the worst with 0 out of 60 correct. Discovery generated 25 out of 60 correct Chicago citations, while Summon had 10 correct. However, Summon did outperform Discovery in APA, with 21 out of 60 correct citations to Discovery’s 19.

Fig. 1 Discovery correct citations
Fig. 2 Summon correct citations

In fig. 3, the number of citation errors for print books from Discovery shows that Chicago had the most, with 22 errors in total. APA had 17 errors in print books, while MLA had the fewest with only 5. In fig. 4, print book citations produced by Summon had fewer Chicago citation errors than Discovery, with 17. However, Summon print book citation errors were higher than Discovery’s in APA, with 21, and in MLA, with 29 errors. Also found in fig. 3 and fig. 4 are the numbers of errors by location. For print book citations in Discovery and Summon, the most errors occurred in the book title and the place of publication. In the title, frequent non-capitalization of words was the sole cause of the errors on both platforms. Place of publication in Discovery Chicago citations had a high number of errors due to missing information.
In Summon, place of publication had a high number of errors in the MLA citations because the place of publication was added, which is not required under the MLA 9th edition guidelines.

Fig. 3 Number of print book citation errors from Discovery
Fig. 4 Number of print book citation errors from Summon

Fig. 5 shows that Discovery journal article citation errors were highest in APA, with 22 in total. Errors in citations generated in Chicago and MLA style by Discovery were low, with 7 and 5, respectively. Fig. 6 shows that in Summon, APA article citation errors were fewer than in Discovery, with 20 in total. Summon citation errors for articles were higher in Chicago, with 27, and in MLA, with 47. In Discovery and Summon, the author category had a high error count across all citation styles, as did eBook titles in APA style. In Summon, additional high-error categories were the place of publication in Chicago style, due to multiple places added to the citation, and in MLA, due to the category being added to the citations when it is not required. Finally, Summon had frequent errors in eBook Chicago and MLA citations due to missing DOI/URLs and, in MLA, missing database information.

Fig. 5 Number of article citation errors from Discovery
Fig. 6 Number of article citation errors from Summon

In fig. 7, we see that in Discovery, the highest number of eBook citation errors came from citations generated in APA style, with 20 errors in total. Chicago style citations followed with the second highest number of errors at 11, and citations in MLA had 10 total errors. Finally, fig. 8 shows that Summon eBook citation errors were higher than Discovery’s across all categories: Summon MLA eBook citation errors were the highest with 80 in total, while Chicago had 53 errors and APA 31. A high-error category in both Discovery and Summon article citations was the article title in APA citations.
Both platforms frequently failed to capitalize appropriate words per the APA Manual, 7th edition. In Summon, frequent errors again occurred in Chicago and MLA citations due to missing DOI/URLs, and in MLA article citations due to missing database information.

Fig. 7 Number of eBook citation errors from Discovery
Fig. 8 Number of eBook citation errors from Summon

Conclusions

EBSCO Discovery Service and Summon are the two most popular search platforms found in postsecondary libraries across British Columbia. Despite this wide usage and the ease with which students use these platforms to generate citations, this study concludes that neither platform can reliably produce accurate citations in APA, Chicago, or MLA formats. These findings are limited by the scope of the study, including the number of citations that were checked across both platforms, any platform updates that might occur, and the fact that the platforms were not compared with other citation generators such as Zotero, RefWorks, Mendeley, or EndNote. Despite these limitations, knowing the frequency with which errors occur in citations generated by both platforms will help inform our library instruction. Homol (2014) aptly pointed out that knowing which types of errors occur most frequently in the citation generators, and where they occur, is important. This information could be communicated to students who might not be aware of the number of errors that occur in these generated citations (Homol 2014). Knowing this might lead students to seek out other citation generators or to thoroughly check the citations generated by Discovery or Summon for errors. Finally, librarians need to be aware that the products they are demonstrating to users as part of instructional programs are not providing accurate citations, and they need to advise users to be cautious when submitting work that contains citations from EBSCO and Summon Discovery.

References

BCELN. 2021.
“Askaway Usage Statistics.” https://askaway.org/staff/statistics
Chang, Hui-Fen. 2013. “Cite It Right: Critical Assessment of Open Source Web-Based Citation Generators.” LOEX Conference Proceedings 2011 10 (December). https://commons.emich.edu/loexconf2011/10.
Clarke, Maria Elizabeth, and Charles Oppenheim. 2006. “Citation Behaviour of Information Science Students II: Postgraduate Students.” Education for Information 24 (1): 1–30. https://doi.org/10.3233/EFI-2006-24101.
Homol, Lindley. 2014. “Web-Based Citation Management Tools: Comparing the Accuracy of Their Electronic Journal Citations.” The Journal of Academic Librarianship 40 (6): 552–57. https://doi.org/10.1016/j.acalib.2014.09.011.
Kessler, Jane, and Mary K. Van Ullen. 2005. “Citation Generators: Generating Bibliographies for the Next Generation.” The Journal of Academic Librarianship 31 (4): 310–16. https://doi.org/10.1016/j.acalib.2005.04.012.
Van Ullen, Mary, and Jane Kessler. 2012. “Citation Help in Databases: The More Things Change, the More They Stay the Same.” Public Services Quarterly 8 (1): 40–56. https://doi.org/10.1080/15228959.2011.620403.
XX Library. 2019. “Information Desk Statistics.”
Zhang, Yingting. 2012. “Comparison of Select Reference Management Tools.” Medical Reference Services Quarterly 31 (1): 45–60. https://doi.org/10.1080/02763869.2012.641841.