LiveText is a browser-based web application that enables institutions to systematically collect data for assessing student learning. Through its digital workspace, LiveText can assess assignments, journals, evaluations, and collections of academic artifacts, all grounded in student learning outcomes and the institutional mission. The system stores and organizes data for analytics-based reporting.
Frequently Asked Questions
How do I obtain a LiveText account?
If you are interested in using LiveText, please contact Xizhu Wang to request an account.
I have an account, but I need some help starting.
- Assessment Tutorial: How to assess a student assignment
- How to complete an AIS Report in LiveText
- How to review an AIS Report in LiveText
- How to complete an assignment in LiveText - ILO Project
- LiveText Purposeful Assessment Planning
- LiveText Help Center
- LiveText Webinar Training (ongoing): Schedule & Descriptions and Registration
Is there a support team at Pepperdine that can help me?
- Technical Support: Xizhu Wang
- Assessment Consultant: Lisa Bortman
- Seaver College: Xizhu Wang and Kailee Rogers
- Graziadio Business School: James Berneking
- School of Law: Katie Dodds
- Graduate School of Education and Psychology: Amy Tuttle-Guerrero
- School of Public Policy: Michael Shires
- Student Life: Brad Dudley
National Assessment Websites
- National Institute for Learning Outcomes Assessment, NILOA
- Association of American Colleges and Universities, AAC&U
- Association for the Assessment of Learning in Higher Education, AALHE
- American Association of University Professors, AAUP
- American Association of College Professors, AACP
- Quality Matters (good site for rubrics)
- What is "Good" Assessment? A Synthesis of Principles of Good Practice
- Assessment Update
- Practical Assessment, Research, and Evaluation
- Research & Practice in Assessment
- The Journal of Assessment and Institutional Effectiveness
Alignment - The alignment process connects dimensions of the University that support achievement of goals: Student Learning Outcomes reflect and advance Program Learning Outcomes which reflect and advance Institutional Learning Outcomes.
Advancement of Student Learning Council (ASLC) - The University-wide group which oversees the assessment process for the entire University; there is one representative from each school. The Associate Provost for Assessment and Institutional Effectiveness serves as an ad hoc member.
Assessment Plan - The assessment plan is a schedule for examining the PLOs, including the type of evidence and who is responsible for conducting the review and closing the loop. A program should rotate through assessment of all of its PLOs in a 5-year cycle.
Authentic Evidence - Authentic evidence refers to assessment that measures a student's ability to apply his or her knowledge in real-world settings and involves having experts outside the University evaluate the student's knowledge, as in an internship, rather than using an academic construct such as a test.
Benchmarking - This involves using comparisons to make meaning of empirical data. When you're benchmarking, you don't just look at a given number, say, 68% graduation within 6 years; you look at it in comparison to some standard that helps you to decide how good or bad your number is: say, in comparison to your in-house goal of 85% graduation in 6 years, or the 75% graduation rate of peer institutions. A benchmark is a point of reference.
Capstone - The capstone is a culminating project or experience that generally takes place in the student's final year of study and requires review, synthesis, and application of what has been learned over the course of the student's college experience. The result may be original research, an innovative design, an art exhibit, or a performance. The capstone can provide evidence of assessment of a range of outcomes.
Closing the Loop - This is an iterative, ongoing four-step process: 1. defining learning outcomes, 2. choosing a method or approach and using it to gather evidence of learning, 3. analyzing and interpreting the evidence, 4. using this information to improve student learning. The cycle must be completed and repeated to see whether the changes have produced the desired result.
Core Competencies - Graduating students should demonstrate mastery in five areas: written communication, oral communication, quantitative reasoning, critical thinking, and information literacy.
Curriculum Map - This is a map outlining the Program Learning Outcomes promoted by each course in the discipline. "I" indicates courses that introduce the PLO, "D" indicates courses that develop the PLO, and "M" indicates courses that result in a mastery of the PLO.
Direct Evidence - Evidence gathered from a performance-based observation or sampling of student work. Example: using a rubric to evaluate the quality of student papers. Direct evidence may include locally-developed tests, performance appraisal, oral examinations, simulations, behavioral observations, portfolios, external examinations of student work, and other course activities assessed with rubrics.
Diversity - Diversity involves multiple perspectives and the representation and recognition of people of different backgrounds and points of view in the various constituencies of the university: students, faculty, staff, and administration.
Educational Effectiveness Indicators (EEIs) - EEIs are also known as educational performance indicators and are a list of direct and indirect ways in which the University examines student learning.
High-Impact Practices (HIPs) - HIPs include first-year seminars, writing-intensive courses, collaborative assignments, undergraduate research, service learning, internships, capstones, and international programs.
Indirect Evidence - Evidence that assesses the perceptions of students or faculty. This evidence may be collected through student surveys, questionnaires, focus groups, archival records, interviews, and other indirect methods. Example: using a survey to assess how students perceive the quality of your discipline's writing curriculum.
Institutional Learning Outcomes (ILOs) - The specific knowledge or skills students should actually acquire/develop through their educational experience while at Pepperdine University. These should be broad and general so that a wide variety of Program Learning Outcomes can relate to each.
Internal Review - The review of each program review report, conducted by the Advancement of Student Learning Council (ASLC).
Inter-rater Reliability - This is a group activity used to calibrate or norm a rubric: each member of the group applies the rubric to evaluate a product or behavior; the members then compare their judgments and discuss the basis for them. The goal is to achieve consistency in applying standards.
Memorandum of Understanding (MOU) - The MOU is developed by the dean as a part of "Closing the Loop" and outlines the guidelines, expectations, and plans for program improvement over the next five years.
Norming - In assessment of student learning, norming is a process of training raters to use rubrics to evaluate student products and performances consistently.
Numerical Expectations (benchmark) - A quantitative measure used to indicate whether student work exceeded, met, or fell below expectations. This expectation is set prior to the assessment and used to evaluate the results in the findings section. Example: stating in the introduction that the discipline expects to see 85% of its students writing at a level of "satisfactory" or above.
Program Learning Outcomes (PLOs) - The specific knowledge or skills students actually acquire/develop through their educational experience in a particular disciplinary program or major. These should align with the Institutional Learning Outcomes, and the Student Learning Outcomes should align with them.
Rubric - A rubric is a simple assessment tool for measuring student work by classifying statements or behaviors into categories along a continuum. It is a scoring guide that evaluates a student's performance based on a full range of criteria rather than a single numerical score. When administering rubrics, it is ideal to use multiple raters and to establish inter-rater reliability among raters.
Student Learning Outcomes (SLOs) - The specific knowledge or skills students actually acquire/develop through their educational experience in a particular course. These appear on the syllabi and should be aligned with the Program Learning Outcomes.
University Academic Council (UAC) - The UAC is the chief policy-making body for academic procedures, policies, and requirements in the schools. It reviews proposals submitted by the academic council of a particular school for changes or additions to the curriculum, graduation requirements, and general academic policies.
University Planning Committee (UPC) - The UPC is the body responsible for coordinating university level assessment, planning, program review, and resource allocation processes.
University websites that offer good resources and good examples of assessment activities:
- California State University Northridge (CSUN) Academic Assessment & Program Review
- Cal Poly San Luis Obispo Academic Programs and Planning
- Loyola Marymount Office of Assessment
- Loyola Marymount Undergraduate Learning Goals and Outcomes
- Loyola Marymount Overview of Assessment
- UC San Diego Assessment Toolkit
- UC San Diego Student Affairs Assessment Resources
- University of Texas Center for Teaching and Learning Assess Learning
- St. Olaf College Institutional Research and Effectiveness
- University of Iowa Student Life Assessment
- UC Davis Assessment