
Transcript


Elizabeth Gage, Graduate Writing Specialist, The University of Texas at Dallas

Research, Writing, & Publishing with AI

What Graduate Students & Their Advisors Need to Know About

Google (2024)

Graduate students are expected to produce original research, which places a greater emphasis on intellectual property.

Emerging Scholars: Original Research

As emerging scholars, graduate students need to be familiar with the tools that experts in their field and industry use.

Emerging Scholars: Following Industry

There are more international students at the graduate level than at the undergraduate level, both by count and proportion.

International Students

As emerging scholars, graduate students share their original work at conferences and through theses/dissertations and journal articles.

Emerging Scholars: Publication

Why Graduate Students?

Research with AI

Literature Review

Finding Sources: Students may use AI programs to produce search terms or directly ask them to produce sources.
Analyzing Literature: Students may ask AI to explain a source, synthesize multiple sources, or write a literature review.

Potential Use Cases

Using Data

Data Analysis: Students may input raw data into AI programs and ask for analysis or use AI for audio transcription.
Data Visualization: Students may input data into AI programs to produce graphs, charts, and other figures.

Potential harms

Potential benefits

Potential tools

Potential harms

Potential benefits

Potential tools

Writing with AI

Generating Content

Developing or Refining Ideas: Students may ask AI to provide topic ideas or give some counterarguments to their claims.
Creating New Content: Students may ask AI to produce an outline for their paper or actually write sentences or paragraphs for them.

Potential Use Cases

Revising and Editing

Developmental Feedback: Students may input their writing and ask AI programs for feedback on their organization and ideas.
Grammar Feedback: Students may input their writing and ask AI for feedback on their grammar (spelling, punctuation, etc.).

Potential harms

Potential benefits

Potential tools

Potential harms

Potential benefits

Potential tools

Use & Disclose

Many journals allow some AI use, but require disclosures.

Don't Use

A small minority of journals have outright AI bans.

Other Considerations

Even if you're not violating journal policy, there's still risk.

No Policy

A large portion of journals have no AI policy yet.

Publishing with AI

Multidisciplinary: Committee on Publishing Ethics (COPE), Hastings Center Report, Sage, Springer Nature, Taylor & Francis
Humanities: The Historical Journal

Publisher AI Policies

STEM: American Psychological Association (APA), Institute of Electrical and Electronics Engineers (IEEE), JAMA Network (American Medical Association), Science, World Association of Medical Editors (WAME)

ChatGPT, Connected Papers, Consensus, Elicit, Gemini, Grammarly, Hemingway Editor, Humata

AI Programs

Julius, Moxie, Otter.ai, Research Rabbit, Semantic Scholar, The Writer's Diet, Vizly

Bhavsar, D., Duffy, L., Hamin, J., Lokker, C., Haynes, R. B., Iorio, A., Marusic, A., & Ng, J. Y. (2024). Policies on artificial intelligence chatbots among academic publishers: A cross-sectional audit. medRxiv. https://doi.org/10.1101/2024.06.19.24309148

Elsevier. (2024). Insights 2024: Attitudes toward AI. https://www.elsevier.com/insights/attitudes-toward-ai

Google. (2024). Gemini (Oct 9 version) [Large language model]. https://gemini.google.com/app

Institute of International Education. (2023). Open Doors report on international educational exchange: International students by academic level, 1999/00 - 2022/2023. https://opendoorsdata.org/data/international-students/academic-level/

National Center for Education Statistics. (2022-2023). Integrated Postsecondary Education Data System: Enrollment by level of student, attendance status, and gender, Fall 2022 (provisional release data) [Data set]. U.S. Department of Education, Institute of Education Sciences. Retrieved October 24, 2024, from https://nces.ed.gov/ipeds/SummaryTables/report/202

Thorp, H., & Vinson, V. (2023, November 16). Change to policy on the use of generative AI and large language models. Editor's Blog. https://www.science.org/content/blog-post/change-policy-use-generative-ai-and-large-language-models

References

As of December 2023, 107 out of 163 (66%) academic publishers in the International Association of Scientific, Technical and Medical Publishers (STM) had no published AI policy (Bhavsar et al., 2024). This is somewhat congruent with my own findings: I found several STEM journals with no published AI policy, and I could only find one humanities journal with a policy. I could not find any prominent composition/rhetoric or literature journal with a published policy.

No Policy

Literature Review: Benefits

  • If a student has no idea what search terms to use, AI programs can offer some helpful terms for getting started.
  • If the topic is well researched and has been abundantly published on, AI programs are more likely to surface some foundational sources.
  • Some AI tools have been created specifically for research, and these tools are more effective at producing relevant search terms and specific sources.
  • Chatbots can sometimes produce helpful summaries about general topics, such as a historical figure's beliefs or a well-known theory or philosophy, which can help a student get started.

Revising and Editing: Benefits


  • In the absence of human feedback, AI programs can offer some helpful suggestions for improving organization and/or clarity of ideas.
  • If prompted to do so, chatbots can identify claims that may need a citation.
  • AI programs (such as Grammarly) can catch grammar issues and offer suggestions to correct them.
    • This is especially helpful for students unfamiliar with certain grammar rules (e.g., many students are unsure of comma rules).
    • International students in particular may benefit, as they may misuse words unknowingly or omit articles where necessary.

    Using Data: AI Tools


    • Otter.ai:
      • Use Cases: Audio transcriptions (from interviews, meetings, etc.) (with free options).
      • Privacy: Otter.ai trains its AI on de-identified audio recordings.
    • Julius AI:
      • Use Cases: Chat with your data, perform advanced analysis, create visualizations, and generate reports (with free options).
      • Privacy: Julius guarantees that no other users can access your data, that it does not train its AI on your data, and that your data is completely erased from its servers when you delete it from the app.
    • Vizly:
      • Use Cases: Chat with your data, perform advanced analysis, create visualizations, and generate reports (with free options).
      • Privacy: Vizly guarantees that no other users can access your data, and your data is deleted permanently after one hour of inactivity.

    Literature Review: Harms

    • AI programs may produce irrelevant search terms, omit important search terms, or be unable to give search terms specific enough for the scope of the literature review.
    • Chatbots like ChatGPT and Gemini are not optimized for scholarly research, so, when prompted to give a list of sources, they sometimes:
      • retrieve irrelevant sources
      • produce fake sources
      • omit important resources
    • Some chatbots (e.g., ChatGPT) don't have access to recently published sources.
    • When giving research summaries, chatbots may (a) produce copyrighted information without citations or sources or (b) fabricate information or data to support unfounded claims.

    In response to the prompt below, Gemini produced four fabricated sources and one "hypothetical example." Prompt: "Give me specific journal articles related to my thesis statement [the presence of bees reduces crime in urban parks]."

    Using Data: Benefits


    • Some AI programs have more stringent data policies and don't add user activity to their databases. Some even allow users to erase their data from the company's servers.
    • Using AI for time-consuming tasks, such as audio transcription or initial data analysis, can make the research process much more efficient.
    • Many AI programs are optimized for data analysis, so their output is much more reliably accurate than that of chatbots like ChatGPT and Gemini.
    • Using AI to analyze data can surface trends the student may have missed (especially trends not directly related to the research questions) that may help explain patterns elsewhere.

      31% of researchers and clinicians have used AI for work, 67% expect to use AI in the near future, and 72% expect its impact on their work to be transformative or significant.

      Statistics:

      Elsevier (2024)

      Revising and Editing: AI Tools


      • There are many tools in this category; Grammarly and Microsoft Word's Editor are the most familiar.
      • The Writer's Diet: Improves readability by prioritizing concision.
      • Hemingway Editor: Improves overall readability.
      • Moxie (formerly Academic Insights Lab): Not free, but offers detailed, nondirective writing feedback.

        Generating Content: AI Tools

        • ChatGPT and Gemini: Both chatbots work well for generating and developing ideas. You can ask for topic ideas, have the chatbot play devil's advocate to critique your ideas, or even produce an outline for your paper.
        • Consensus: Basically ChatGPT for scientific writing. Finds and summarizes sources, synthesizes multiple sources, and drafts text based on the sources found.
        • Humata: Chat with sources. Ask for summaries or probe for information relevant to your research question.
        • In general, this is probably the use case where generative AI has the fewest effective applications.

          Revising and Editing: Harms


          • Data privacy is still a concern for unpublished writing.
          • Because AI programs are trained primarily on internet sources, they are usually not equipped to address the nuances of academic genres.
          • Chatbots tend to lean heavily toward concision, cutting important definitions or nuances from an argument for the sake of brevity.
          • AI programs may incorrectly diagnose grammar issues (as seen in Grammarly) or miss other grammar issues.
          • As we've discussed this week, chatbots cannot replicate the human element: they cannot ask clarifying questions or offer nondirective guidance the way human tutors can.

            • Copyright Concerns:
              • AI may produce copyrighted information (such as information from books or journals) without citations. If a student uses that information in their own publication (journal article, thesis/dissertation, etc.), then they risk copyright infringement, which could even lead to legal action.
              • The legal ownership of content created by AI programs is still unfolding. If a student uses AI programs to generate content without attribution, they may be violating the AI program's terms of use.
            • Unpublished and Changing Policies: While plenty of journals haven't yet published AI policies, they may be underway, or there may be an unspoken bias against AI use. Students should tread carefully when they don't have an explicit policy to follow.

            Other Considerations

            Using Data: Harms


            • Many freely available AI programs add users' input to their training data, meaning that
              • a student's data may be reproduced (without citation) for another user
              • a student's data may be read by a human reviewer
              • any time chat logs are stored, they are at risk of being accessed during a data leak
              • this is especially problematic for proprietary or sensitive data (e.g., protected by IRB, FERPA, or HIPAA)
            • AI programs may inaccurately:
              • analyze data, leading to fabricated results
              • transcribe audio, leading to errors in results and quotations
              • create figures and illustrations based on data, leading to faulty depictions of results

            Literature Review: AI Tools

            • Semantic Scholar: Finds sources, provides a TLDR, and identifies highly influential citations.
            • Research Rabbit: Finds sources and provides interactive visualizations of author networks and related papers.
            • Connected Papers: Finds sources and provides interactive visualizations of author networks and related papers.
            • Elicit: Finds sources and identifies common themes. Can extract information (e.g., methodology, results) from multiple sources and present in table format. Also allows users to ask questions about a topic and receive responses with (real) citations, as well as chat with sources.
            • Consensus: Basically ChatGPT for scientific writing. Finds and summarizes sources, synthesizes multiple sources, and drafts text based on the sources found.
            • Humata: Chat with sources. Ask for summaries or probe for information relevant to your research question.

              In Fall 2022, only 2.3% (347,602) of undergrads were international, compared to 14.7% (467,027) of grads. The proportion of international students at the graduate level has been rising since 2020.

              Statistics:

              Institute of International Education (2023);National Center for Education Statistics (2022-2023)

              • Authorship: AI may never be listed as an author.
              • Grammar Support: AI use solely to improve grammar and readability does not need to be disclosed. This is basically seen as copy editing.
              • Creating Content: AI use for creating text, figures, images, and code needs to be disclosed. This usually happens in the acknowledgments and/or methods. Sometimes transcripts of AI use must be attached. Some publishers prohibit or strongly discourage AI-generated images or multimedia (often due to legal issues).
              • Analyzing Data: This isn't often addressed in publishers' AI policies, but when it is, disclosure is required.

              Common Policies

              Generating Content: Harms

              • If the AI program is unfamiliar with a topic, it might make overly broad statements and lack the specificity needed in graduate writing.
              • AI programs replicate the biases and misinformation found in their training data (i.e., the internet), though chatbots are often specifically trained to mitigate this.
              • AI writing is often perceived as repetitive and/or flat, so its output will likely require at least moderate revision.
              • Using AI for creating new content, such as sentences and paragraphs, is sometimes considered plagiarism.
              • Again, AI programs may produce copyrighted information without sources.

              As of December 2023, four out of 56 (7%) academic publishers in the International Association of Scientific, Technical and Medical Publishers (STM) with published AI policies completely banned AI use by authors (Bhavsar et al., 2024). This small number is congruent with my own findings: I did not find any journal that banned AI use entirely. Science actually relaxed its policy slightly in 2023 to allow some AI-generated content (Thorp & Vinson, 2023).

              Don't Use

              Generating Content: Benefits

              • Asking chatbots for research gaps or popular topics in a field can provide a fruitful list of ideas for getting started. Because the conversation is individualized, students can even give a list of their interests and receive tailored topics for research.
              • Asking chatbots to play "devil's advocate" can reveal gaps in logic or underlying assumptions that need to be made explicit.
              • An initial outline from an AI program can help a student begin organizing their ideas if they're unsure where to get started.
              • AI programs can help students craft often troublesome sentences and paragraphs (such as transition sentences to link ideas, introductory sentences or paragraphs, and conclusion sentences or paragraphs).