
Progress Diary (Citizen Science and Chemistry) #4

Open
lq33shan opened this issue Dec 3, 2019 · 12 comments
lq33shan commented Dec 3, 2019

Hi everyone,

This is Sarah, and I am currently working on a project relating to Citizen Science and Chemistry. Citizen science is a type of scientific research activity in which members of the general public participate in data collection and analysis. The aim of this project is to develop resources and infrastructure for open learning and a crowdsourced approach to research, guided by pedagogical research and open science principles.
This issue is a progress diary documenting my work during the summer research period of December 2019 to February 2020.


lq33shan commented Dec 4, 2019

03/12/2019-04/12/2019

As the project has just started, the following questions need to be answered before a formal research question can be formulated:

  1. What is citizen science?

  2. What are some examples of chemistry citizen science projects (with references)?

  3. How can we define/describe the types of chemistry citizen science projects, e.g. data collection, data entry, experimentation?

  4. Citizen science in education – is this a useful tool, and what are the benefits/disadvantages?

  5. How could we use citizen science in Australia as a way of teaching chemistry to children and young adults and members of the community?

A literature review was conducted to look into current research on citizen science.
• Citizen science is defined as “scientific work undertaken by members of the general public, often in collaboration with or under the direction of professional scientists and scientific institutions”.
• Citizen science projects allow scientists to gather data worldwide (primarily via the internet). Participants may be individuals, teams, or networks of volunteers.
• These projects enable scientists to carry out large-scale research that would otherwise be too expensive or time-consuming.
• Main areas of research include astronomy, ornithology, climatology, environmental science, marine science, etc.
• Benefits and challenges coexist. There are concerns about the validity of data due to the lack of training in the data collection process and the possibility of fabricated data.
• New statistical and high-performance computing tools have been introduced to help address sampling bias, detection and measurement error, misidentification, and spatial clustering.
• Apart from yielding scientific outcomes, citizen science also plays a role in education:
o Education for both the general public and students in the classroom (“Student-Scientist Partnership programs”)
o Participants in citizen science projects gain more knowledge in the related field and are more likely to take appropriate actions to improve certain conditions.
o They provide opportunities for youth to apply knowledge from the classroom in a real-world setting.

Readings were done to identify barriers and challenges of project implementation (Kobori et al., 2016).
Problems

• It is important for a CS project both to attract appropriate volunteers and to retain their participation over the period of research.
• Participation may follow a “Zipf curve” pattern, where a small group of participants contributes a disproportionately large share of the data.
• In Japan, two major challenges are:
o A lack of young volunteers and the loss of older volunteers
o A lack of staff and funding in the organisations

Solutions
• To attract more volunteers: take social factors into account; identify the motivations of participants (e.g. via preliminary surveys); implement strategies such as competitions and rewards.
• To retain more volunteers: provide up-to-date feedback; offer rewards; establish social bonds within the volunteer group.
• Build in features that are attractive to volunteers, e.g. in the eBird project, participants can access the crowdsourced data and find more of the birds they like.
• Take cultural differences into account when designing the program: Eastern cultures emphasise collective responsibility, while Western cultures emphasise individual responsibility and achievement.
• Implement evaluation and adaptive project management strategies, using tools such as logic models, front-end evaluation, formative evaluation, and summative evaluation.

There are concerns when using CS as a primary research tool:
• Preference for scientists’ own data; distrust of CS data; low awareness of the availability of CS data; low suitability of the data sets (Burgess et al., 2017).
• One way to optimise suitability is for scientists to selectively choose data sets that fit their research topics (Bonney et al., 2014).
• New technologies such as deep learning can help analyse the data and build better models (Kobori et al., 2016).
• Increasing the transparency and availability of methods and data attributes also helps (Burgess et al., 2017).

The Royal Society of Chemistry in the UK has hosted a number of global citizen science projects since 2012, aiming to improve students’ understanding of chemistry and develop their interest in science:

  1. Chemistry in sport: a global experiment comparing sports drinks made by students and water (https://edu.rsc.org/resources/chemistry-in-sport/2084.article)
  2. Measuring vitamin C in food (https://edu.rsc.org/resources/measuring-vitamin-c-in-food/1280.article)
  3. The art of crystallisation (https://edu.rsc.org/resources/the-art-of-crystallisation/1379.article)
  4. Water and hydrogels (https://edu.rsc.org/resources/water-and-hydrogels/1703.article)
  5. Mission starlight (https://edu.rsc.org/resources/mission-starlight/2073.article)


lq33shan commented Dec 5, 2019

05/12/2019

  • Research on CS in education:

Apart from serving a research purpose, citizen science (CS) projects can also serve an educational purpose. Because of their participatory nature, CS projects are a good way to advance the scientific literacy of the general public. However, studies adopting CS in conservation biology showed that although participants gained more knowledge in the related field, there was neither a significant change in behaviour nor an improvement in understanding of the nature of science. This might be partially explained by discrepancies in educational backgrounds and the lack of a deep learning process [1].

On the other hand, CS can be utilised in the classroom to provide students with hands-on experience of real-world research. Science education before university can be very static: laboratory exercises are often designed so that students follow the given procedures and reach a uniform conclusion [2]. Traditional ways of teaching may limit students’ ability to explore science themselves and therefore constrain their ability to inquire and imagine. As CS projects are usually ongoing research with no definitive conclusions to reach, they allow young participants to engage in science activities with a more proactive attitude and encourage them to “think outside the box” [2].

Many successful cases have been reported, most of which investigate biodiversity, phenology, and ornithology. For example, a biodiversity CS project was conducted in Austria in which 428 students aged 8 to 18 monitored biodiversity in their local environment for two years [3]. Pre- and post-study surveys were delivered to evaluate the cognitive, affective, and behavioural changes in students. While results showed an increase in interest, motivation, mastery, and positive attitude towards wild animals among all students, primary students rated themselves highest on most scales and contributed the most database entries. However, the high self-efficacy scores did not correlate with better data quality, which should be borne in mind by both teachers and scientists.

Although most CS projects are ‘contributory’, meaning they only involve a simple data collection process, some projects in higher education also encourage data analysis by the participants. The University of Western Australia ran a phenology CS project, ClimateWatch, as part of a biology class from 2012 to 2014 [4]. First-year students were asked to report sightings of local species and to write scientific articles based on the data as part of their course. University students similarly reported higher levels of learning, increased awareness of environmental issues, and improved research skills after the project. Nevertheless, 85% of them doubted the reliability of the submitted data, with only 40% agreeing that data from CS projects are reliable. Falsification might become a problem when CS projects are compulsory parts of university courses.

For a CS project to be effectively implemented in educational settings, both research objectives and learning outcomes need to be considered [3]. To improve data quality, it is critical for teachers to be trained in the background knowledge/skills and to have appropriate toolkits before acting as instructors for the projects [2]. Additionally, involving scientists who are active in the research field not only provides up-to-date information but also gives students role models that facilitate learning motivation. A CS project must be carefully designed to meet both research and education requirements, so collaboration between researchers and educators is beneficial [3].

Evaluation of the social impacts of a CS project is also important but often neglected. Implementing CS projects in high schools would improve scientific competency among potential university candidates. Results from CS projects could feed back into education systems to guide changes in curriculum or to inspire reform strategies. CS projects can also build social capital and influence the general public’s attitude towards science [2].

References

[1] R. C. Jordan, S. A. Gray, D. V. Howe, W. R. Brooks, and J. G. Ehrenfeld, “Knowledge Gain and Behavioral Change in Citizen-Science Programs,” Conserv. Biol., vol. 25, no. 6, pp. 1148–1154, Dec. 2011.
[2] H. R. Shah and L. R. Martinez, “Current Approaches in Implementing Citizen Science in the Classroom,” J. Microbiol. Biol. Educ., vol. 17, no. 1, p. 17, Mar. 2016.
[3] J. Kelemen-Finan, M. Scheuch, and S. Winter, “Contributions from citizen science to science education: an examination of a biodiversity citizen science project with schools in Central Europe,” Int. J. Sci. Educ., vol. 40, no. 17, pp. 2078–2098, Nov. 2018.
[4] N. Mitchell, M. Triska, A. Liberatore, L. Ashcroft, R. Weatherill, and N. Longnecker, “Benefits and challenges of incorporating citizen science into university education,” PLOS ONE, vol. 12, no. 11, p. e0186285, Nov. 2017.

  • Readings on Breaking Good project

  • Start formulating research questions

@lq33shan lq33shan removed their assignment Dec 5, 2019

lq33shan commented Dec 6, 2019

A widely accepted categorisation of Public Participation in Scientific Research (PPSR) by Bonney et al. (2009) applies to CS projects as well [1]:

  1. Contributory projects, which are generally designed by scientists and for which members of the public primarily contribute data.
  2. Collaborative projects, which are generally designed by scientists and for which members of the public contribute data but also may help to refine project design, analyse data, or disseminate findings.
  3. Co-created projects, which are designed by scientists and members of the public working together and for which at least some of the public participants are actively involved in most or all steps of the scientific process.

Chemistry is the study of the matter, substances, and reactions that are part of everyday life.

  • an umbrella term
  • embedded in all divisions of science
  • hard to identify projects that are specifically designed to deliver chemistry knowledge
  • some biology/environmental CS projects can also be classified as chemistry projects

To develop CS projects that speak to the Australian context, we first need to understand the current situation of chemistry education in Australia.

The Australian senior secondary curriculum for Chemistry consists of four units [2]:

  1. Chemical fundamentals: structure, properties and reactions
  2. Molecular interactions and reactions
  3. Equilibrium, acids and redox reactions
  4. Structure, synthesis and design

The Australian Council for Educational Research published a review in 2018 addressing challenges in STEM education [3]:

  1. Improve student outcomes in STEM
  2. Build the STEM teacher workforce
  3. Rethink the STEM curriculum

Implementing CS projects can serve as an effective response to all these challenges, as they encourage students’ interest as well as enhance science inquiry and literacy. Teachers would also benefit from CS projects, as they get more opportunities to interact with distinguished scholars and enrich their knowledge.

Meeting with Alice:

  • Compared to astronomy or biology CS projects, which make up the majority of CS projects, chemistry projects may require experimental equipment that is often unavailable to the general public or school kids. In addition, knowledge/skill requirements can also be a barrier to implementing chemistry CS projects.
  • Australia is a multicultural country. It is important to understand how cultural differences may influence CS project participation/ outcomes.
  • Plans for the next two days:
    o Try to develop some questions for pre-/post- survey in CS projects (evidence-based).
    o Review publications to find out more about cultural differences in CS projects.

Follett and Strezov, 2015: An Analysis of CS-based Research: Usage and Publication Patterns [4]
Method used to review articles on CS:

  1. Export all records using “citizen science” as the keyword and combine the lists from Web of Science and Scopus.
  2. Remove duplicates and falsely matched records to produce the “master list”.
  3. Examine the titles and abstracts to identify papers that describe specific CS projects. Papers were categorised into different focus areas, such as theory, methodology, validation techniques, etc.
  4. Categorise the CS projects into 5 types: action, conservation, investigation, virtual, education.
  5. Classify the CS projects into different subjects: astronomy, environment, biology, medical, other.

Getting started on a publication review inspired by [4]:

  • Two databases: Web of Science & Scopus
  • The first publication related to CS came out in 1997 [4]
  • Topic = citizen science, time refined between 1997 to 2019
    • Web of Science: 3,526 results
    • Scopus: 3,925 results
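
The merge-and-deduplicate step of the review method can be sketched in a few lines of Python. The records and field names here are illustrative, not real database exports; actual Web of Science and Scopus exports use different field labels.

```python
# Sketch of steps 1-2 of the review method: combine the Web of Science
# and Scopus exports, then drop duplicate records to form the "master
# list". The records below are illustrative, not real exports.
wos = [
    {"title": "Citizen Science in Ornithology", "doi": "10.1000/a"},
    {"title": "Foldit: A Protein Game", "doi": "10.1000/b"},
]
scopus = [
    {"title": "Citizen science in ornithology", "doi": "10.1000/a"},
    {"title": "CS Data Quality", "doi": "10.1000/c"},
]

def build_master_list(*exports):
    """Merge exports and keep the first record seen for each DOI."""
    seen, master = set(), []
    for record in (r for export in exports for r in export):
        key = record["doi"].lower()
        if key not in seen:
            seen.add(key)
            master.append(record)
    return master

master = build_master_list(wos, scopus)
print(len(master))  # 3 unique records
```

In practice, deduplication would also have to match on normalised titles, since DOIs are sometimes missing from one database’s export.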

References:
[1] R. Bonney et al., Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education. A CAISE Inquiry Group Report. 2009.
[2] Australian Curriculum, Assessment and Reporting Authority (ACARA), “Chemistry.” [Online]. Available: https://www.australiancurriculum.edu.au/senior-secondary-curriculum/science/chemistry/. [Accessed: 06-Dec-2019].
[3] M. J. Timms, K. Moyle, P. R. Weldon, P. Mitchell, and Australian Council for Educational Research (ACER), Challenges in STEM learning in Australian schools: literature and policy review. 2018.
[4] R. Follett and V. Strezov, “An Analysis of Citizen Science Based Research: Usage and Publication Patterns,” PLOS ONE, vol. 10, no. 11, p. e0143687, Nov. 2015.


lq33shan commented Dec 9, 2019

09/12/2019

Thoughts during the weekend:
Research question: how to develop a standardised set of questions that includes multi-level outcome evaluation and is translatable to CS research in all disciplines.

Global Distribution of Research
1. Data from Scopus
Total = 3,925 results
Top 3: US = 1675; UK = 803; Australia = 324
(map image)
To have a better visualisation of the other countries, data from the US and UK were excluded from the following map:
(map image)
2. Data from Web of Science
Total = 3,526 results
Top3: US = 1574; UK = 678; Australia = 302
(map image)
To have a better visualisation, data from the US and UK were excluded from the following map:
(map image)
Overall, the US contributes the majority of CS-related research, with Europe in second place. Australia is also a pioneer in CS research and is likely the third largest source of CS research worldwide. However, the two databases predominantly index papers from the Western world, so the results could be biased.
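
As a rough cross-check of the map data, the country shares can be computed directly from the Scopus counts reported above (total = 3,925; US = 1,675; UK = 803; Australia = 324):

```python
# Country shares of CS publications in the Scopus search (1997-2019),
# using the counts reported in this entry.
scopus_counts = {"US": 1675, "UK": 803, "Australia": 324}
total = 3925

shares = {country: round(100 * n / total, 1) for country, n in scopus_counts.items()}
print(shares)  # {'US': 42.7, 'UK': 20.5, 'Australia': 8.3}
```

So the US alone accounts for over 40% of the records, consistent with the caveat about the Western skew of both databases.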

To improve outcomes of CS projects, we need to first define what the good/bad outcomes are, i.e. understand the evaluation of CS projects.
Some examples of ways of evaluation in existing literature:

1 Scientific knowledge and attitude change: The impact of a citizen science project (Cited by 582) [1]

  • Employed existing instrument in the literature
  • Examined knowledge and attitude change using the items listed below:

A. Attitude toward science scale items:

  • Science and technology are making our lives healthier, easier, and more comfortable.
  • The benefits of science are greater than any harmful effects.
  • Science makes our way of life move too fast.
  • We depend too much on science and not enough on faith.

B. NEP/humans-with nature subscale items:

  • Humans were created to rule over the rest of nature.
  • People have the right to modify the natural environment to suit their needs.
  • Plants and animals exist primarily to be used by people.
  • People need not adapt to the natural environment because they can remake it
    to suit their needs.

C. Understanding of the scientific process items:

  • When you hear or read the term ‘scientific study’, do you have (please check one):
    a) a clear understanding of what it means
    b) a general sense of what it means
    c) little understanding of what it means
  • If you checked a) or b) for the previous question, please tell us in your own words
    what it means to study something scientifically:

D. Bird knowledge scale items:

  • Most songbirds lay one egg per day during the breeding season.

  • Clutch size refers to the number of eggs a female bird can fit in her nest.

  • All birds line their nest with feathers.

  • Humans can handle nestlings with little fear of the nest being abandoned by the
    adult birds.

  • The age of a female bird can influence the number of eggs she lays.

  • Some birds need supplemental calcium to produce eggs.

  • Most cavity-nesting birds eat primarily seeds.

  • Cavity-nesting species that use nest boxes are safe from predators.

  • Some species of warblers use nest boxes.

  • Nest boxes should never be made of pressure-treated wood.

  • Each item has two possible responses: agree = 1, disagree = 0. Analysis is based on the sum of scores in each category.

  • Results: increased knowledge of birds; no change in attitude towards science or the environment, or in understanding of the scientific process.
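
The binary scoring scheme can be sketched as follows; the category names and response vectors are hypothetical, not data from the study:

```python
# Each survey item is coded agree = 1 / disagree = 0, and analysis is
# based on the sum of item scores within each category.
def category_scores(responses):
    """Return the per-category sum of binary item scores."""
    return {category: sum(items) for category, items in responses.items()}

# Hypothetical participant: 4 attitude items, 10 bird-knowledge items.
participant = {
    "attitude_toward_science": [1, 0, 1, 0],
    "bird_knowledge": [1, 1, 0, 1, 1, 0, 1, 0, 1, 1],
}
print(category_scores(participant))  # {'attitude_toward_science': 2, 'bird_knowledge': 7}
```

In the real instrument, negatively worded items (e.g. “Science makes our way of life move too fast”) would presumably be reverse-coded before summing.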

2 A Science Products Inventory for CS planning and Evaluation (2018) [2]

  • A science products inventory (SPI) tool, used to support “general-purpose planning and evaluation of CS projects with respect to science productivity”

  • An inventory of science productivity indicators developed by a panel of experts, through 2 workshops held 6 months apart.

  • Assesses scientific progress or products and contextual details on data practices
    o Assessment categories include: written products, data, management, and communication
    o Also covers whether data are reusable and to what extent they are accessible to the general public or researchers.

  • See attachment: SPI_Form.pdf

3 Survey report: data management in Citizen Science projects (2016) [3]

  • The management of data was identified as a major barrier to the re-usability and further integration of data.
  • 121 projects were surveyed; the questionnaire included both open-ended and multiple-choice questions

General information about the project

  1. Which topic areas does your project cover? *
    Space science
    Earth science
    Environmental science
    Social science
    Other

  2. Which geographic extent does the project cover?
    (image attachment)

  3. What is the planned duration of the project?
    < 1 year
    1-4 years
    > 4 years
    I do not know

  4. What is the status of this particular project?
    Experimental (i.e. currently testing ideas and new techniques)
    Operational (i.e. currently in use or ready for use)

  5. How is this citizen science project financed? *
    In kind contributions
    Private donations
    Participant or membership fees
    National funding scheme
    European Commission (FP7 or Horizon 2020)
    Other

  6. Does your project include an explicit data management plan?
    Yes
    No

Discoverability of Citizen Science data

  1. Is the data and all associated metadata from your project discoverable through catalogues and search engines?
    Yes
    No

  2. Does the data contain persistent, unique, and resolvable identifiers?
    Yes
    No

Accessibility of Citizen Science data

  1. Do you provide access to raw data sets or aggregated values? *
    Raw
    Aggregated

  9.1 Do the raw data sets include information about the data producer (e.g. user names,
    individual identifiers, or locations)?
    Yes
    No

  1. How do you manage informed consent?
    Not at all
    Through a generic terms of use agreement
    Through signature of an informed consent form
    Other

  2. Is the data collected by the project accessible via an online service for visualization
    (e.g. by OGC Web Map Service - WMS)?
    Yes
    No

  3. If the data collected by the project is accessible via an online download service, how
    can it be accessed?
    Bulk download of all data at once (e.g. by FTP download)
    User-customizable download of selected parts (e.g. OGC Web Feature Service - with
    Filter Encoding)
    User-customizable services for computation (e.g. by a data processing API)

Re-use conditions of Citizen Science data

  1. Do you make the data available for re-use? *
    Yes (directly after data acquisition/production)
    Yes (in a staged approach, i.e. after an embargo period)
    No

  2. Why did you take this decision?

Usability of Citizen Science data

  3. Is the data collected by the project quality-controlled? *
    Yes
    No

  4. Is the data collected by the project structured based on international or community approved (non-proprietary) standards?
    Yes
    No

  5. Is the data collected by the project comprehensively documented, including all elements necessary to access, use, understand, and process, preferably via formal structured metadata?
    Yes
    No

  6. Are you providing a dedicated contact point for the data collected by the project?
    Yes
    No

  7. Are the data access and use conditions, including licenses, clearly indicated?
    Yes
    No

Preservation and curation of Citizen Science data

  1. For how long do you ensure access to the data from your project? *
    During the lifetime of the project
    Beyond the lifetime of the project

  2. How do you preserve the data from your project?
    Local machine of a project member
    Remote server hosted by a project member
    Research library
    Public repository
    Other

  3. In which country/countries do you store the data?

Additional information

If you would like to share the name of your Citizen Science project for future reference,
please use the box below.

If you would be so kind to share the web page of your project, please use the box below.

If you allow us to come back to you after the survey, please leave your e-mail address
below. This information will not be further distributed, used as part of the analysis or
made available with the survey results.

If you have any additional information to share with us, please feel free to use the
text box below.

4 Key issues and new approaches for evaluating citizen-science learning outcomes [4]

  • Focuses on the evaluation of learning (the cognitive, affective, and behavioural dimensions), i.e. individual learning outcomes (LOs)
  • LOs should be specific, measurable, attainable, relevant, and timely.
  • A balance between scientific outcomes and learning outcomes should be achieved.
  • If learning is a priority goal, activities should be designed to fulfil learning outcomes explicitly.
  • A more comprehensive approach should include multilevel assessment: individual, programmatic and community level.
    (image attachment)

References
[1] D. Brossard, B. Lewenstein, and R. Bonney, “Scientific knowledge and attitude change: The impact of a citizen science project,” Int. J. Sci. Educ., vol. 27, no. 9, pp. 1099–1121, Jan. 2005.
[2] A. Wiggins, R. Bonney, G. LeBuhn, J. K. Parrish, and J. F. Weltzin, “A Science Products Inventory for Citizen-Science Planning and Evaluation,” BioScience, vol. 68, no. 6, pp. 436–444, Jun. 2018.
[3] S. Schade, C. Tsinaraki, European Commission, and Joint Research Centre, Survey report: data management in citizen science projects. Luxembourg: Publications Office, 2016.
[4] R. C. Jordan, H. L. Ballard, and T. B. Phillips, “Key issues and new approaches for evaluating citizen-science learning outcomes,” Front. Ecol. Environ., vol. 10, no. 6, pp. 307–309, Aug. 2012.


lq33shan commented Dec 10, 2019

10/12/2019

Looked into several CS projects online:

E.g. Foldit: solve protein-folding puzzles (find the best configuration that eliminates clashes)

a. Interact with the given protein structure by dragging the 3D model

b. High-scoring solutions are sent to researchers, who analyse them to determine their scientific value

c. From my experience:
Little knowledge gain: though participants might be very engaged with the game, the scientific knowledge they gain from it is very limited.
Self-generation button? Programming might be more helpful to the project?

d. Literature [V. Curtis, “Motivation to Participate in an Online Citizen Science Game: A Study of Foldit,” Sci. Commun., vol. 37, no. 6, pp. 723–746, Dec. 2015.]:

  • Interest in science and in contributing to scientific research motivated most of the participants.
  • But there is a relatively high threshold to participate; many participants hold a STEM qualification (undergraduate or postgraduate).
  • The online forum and community are useful for newcomers.
  • Interactive games are attractive to participants.

Summarised current situation of CS:

(image attachment)

Developed a questionnaire to investigate the current challenges of implementing chemistry citizen science in Australia


lq33shan commented Dec 11, 2019

Meeting with Alice

  • Research aim: To develop a standardised set of questions that evaluates multi-level outcomes for Breaking Good project, which might also be translatable to CS projects in all disciplines.

  • Plan for the next few days:
    [1] More literature readings on the evaluation of Citizen Science project
    [2] Define "multi-level" outcomes
    [3] Advantages and disadvantages of having such a standardised framework

Literature readings

  • Evaluation should follow the progress of project implementation. Types of evaluation include: Baseline evaluation, Formative evaluation, Summative evaluation

  • It is important to hear from all participants, including stakeholders, volunteers, leading scientists, and other project organisers

  • A 'backward research design' allows higher coherence between objectives & outcomes

  • Challenges in adopting a unified evaluation standard:

    [1] Multidisciplinary nature of citizen science project
    [2] A lack of theoretical framework
    [3] Audience diversity: cultural, economic, social, ethnic, geographical
    [4] Challenge to meet scientific rigor (randomised control trials)

  • Multiple handbooks/ sources of STEM project evaluation:

    [1] User-friendly handbook for mixed method evaluation (Frechtling and Sharp 1997)

    [2] User-friendly handbook for project evaluation (Westat 2002)

    [3] Evaluation: A systematic approach (Rossi et al. 2004)

    [4] The Practical Evaluation Guide (Diamond 1999)

    [5] Framework for Evaluating Impacts of Informal Science Education Projects (Friedman 2008)
    i. Agreed-on outcomes (impact categories) of informal science education programs: awareness and understanding, engagement and interest, attitudes, skills, behaviour, other (allows for the inclusion of emergent outcomes in unanticipated aspects of educational impacts)
    ii. Could be adopted in current framework design
    iii. Informal science education vs. formal science education: similarities and differences?

  • Multi-level outcomes: individual, scientific, community
    i. Individual: science literacy, knowledge in specific area, understanding of scientific research, skills, attitude etc.
    ii. Scientific: less important in education CS projects?
    iii. Community: students entering higher education in a science field


12/12/2019

More readings into theories/frameworks/models of evaluation, both within the field of science education and in other disciplines

Thoughts after reading:

1. A framework for articulating and measuring individual learning outcomes from participation in citizen science
Phillips, T., Porticella, N., Constas, M., & Bonney, R. (2018). A Framework for Articulating and Measuring Individual Learning Outcomes from Participation in Citizen Science. Citizen Science: Theory and Practice, 3(2), 3. https://doi.org/10.5334/cstp.126

  • There is consensus in the current literature on which categories of individual learning outcomes (ILOs) should be evaluated in a CS project.
  • Creating a standardised, systematic framework of evaluation is important for comparing CS projects and improving them.
  • Some evaluation instruments could be helpful to practitioners: a database of potential surveys and data collection instruments; sample evaluation reports from citizen science; examples of evaluation designs; guides for conducting evaluations.
  • A framework for articulating and measuring common LOs for CS:
    (image attachment)
  • This paper focuses on environmental CS projects. Though CS projects in different disciplines often have similar outcome expectations, it would be ideal to have a general guideline that fits other projects as well.
  • A lot of CS evaluation is based on existing evaluation frameworks for Informal Science Education. What is different when CS is implemented in classroom settings?
  • A centralised database/guideline that CS practitioners can refer to when developing their own evaluation process would be beneficial.
    • General information + standardised question sets?
  • Apart from ILOs, other outcome focuses may include: scientific (for scientists), socio-economic (for the community), communication (for the proposers and reviewers) [Source: https://ecsa.citizen-science.net/sites/default/files/2017_bki_evaluation_cs.pdf]
  • Social learning theories might be helpful in building frameworks for CS projects?

2. Evaluation models

  • Evaluation models in other disciplines include:
    i) Behavioural Objectives Approach
    ii) The Four-Level Model
    iii) Responsive Evaluation
    iv) Goal-Free Evaluation
    v) Adversary/Judicial Approach
    vi) Consumer-oriented Approaches
    vii) Expertise/ Accreditation Approaches
    viii) Utilization-focused Evaluation
    ix) Participatory/Collaborative Evaluation
    x) Empowerment Evaluation
    xi) Organisational Learning
    xii) Theory-Driven Evaluation
    xiii) Success Case Method
  • Evaluation designs: Pre-experimental, Quasi-experimental, True experimental
  • Can CS evaluation benefit from these models?

@lq33shan

13/12/2019

Worked on the reading list Yaela provided and made notes.

Thoughts for today:

Standardisation of the evaluation framework has its pros and cons.

For researchers who are new to this field, a standardised framework would spare them from designing an evaluation process from scratch. Standardisation would also improve projects where evaluation is not treated with the rigour it deserves. This would reduce the chances of “false-negative” evaluation results and enhance the overall quality of CS projects. Moreover, standardisation allows for consistency between project evaluations, enabling comparisons of projects across the world. Having exemplar projects selected against a widely accepted framework would increase the credibility of the selection and inspire other practitioners to improve their own projects accordingly.

On the other hand, standardisation might also constrain those who are experienced in the field of CS from bringing their work to a higher level. The obligation to follow a framework might slow the development of better evaluation models, as people tend to prefer the handy, standardised structures.

This tension is faced by most standardisation processes, including education. To balance the advantages and disadvantages, education practice defines different levels of achievement so that students may choose the rubrics that match their competence. Similarly, we could develop a structured framework where outcomes are classified from “High distinction” to “Standard”, so that researchers can select the level that suits their needs and build their projects on a well-established basis.

@lq33shan

16/12/2019

Meeting with Yaela:

Survey:
• Wording: make sure everyone understands the questions in the same way
• Concepts: e.g. satisfaction – enjoy?
• Testing the reliability/ validity:
o Reliability: tested with the targeted population
o Validity: tested by the experts of the field

• How exactly to develop a standardised set of questions that evaluates multi-level outcomes of a CS project (and is preferably transferable to other projects)?
• Target population: talking to undergrads who have done Breaking Good projects?
• Research design: literature review? Summarising existing questionnaires and develop tools/ resources/ question bank for CS project evaluation?
• Plan for the next few days:
o Find questionnaires in research that are evaluating individual learning outcomes of CS
o Categorise each of the questions into 6 established categories (using framework developed by Phillips et al., 2018)
o Report quantitatively & if possible, analyse the questions to find the common way of framing/ other similar features between them
• Next meeting on Thursday

Thoughts:

  • How to balance objectivity and subjectivity in the research: not only 'stating the obvious' but also reaching a deeper understanding?
  • What are the existing theories for CS evaluation? (Phillips et al. 2018; Bonney et al. 2009 etc.)
  • Has anyone done similar things in other areas of research? (Is there an established 'question bank' for researchers in other fields of study to draw from when composing a questionnaire?)

@lq33shan

lq33shan commented Dec 17, 2019

Meeting with Alice:

  • Narrowed down the scope of the question to ‘attitude toward science’
  • Discussed the feasibility of the proposed research method: literature search => coding questions => developing tools to improve results
  • Ways of coding: Self-report vs observational? Implicit measure vs explicit measure?
  • What are the current theories of attitude change in the field of science communication as well as in other disciplines?
  • Next meeting on Friday

Research in Attitude toward science

Brossard et al. (2005) applied a psychological theory, the Elaboration Likelihood Model, in their research to investigate changes in attitude toward science during a citizen science project. The theory assumes that “thoughtful attention to stimuli will activate a central or main route to persuasion.” Thus, they hypothesised that participation in a CS project would result in positive attitude change. To measure attitude toward science, they used a modified version of a well-documented instrument, the attitude toward organised science scale (ATOSS) developed by the National Science Foundation of the US. The only modification was that participants were allowed a larger range of responses (strongly disagree – strongly agree) instead of the yes/no answers in the original. However, they also noted that the data drawn from their pilot study showed low reliability for this scale, and no data could be found in the literature regarding discriminant and convergent validity.
Four items were used in the pre-/post-project surveys:

  • Science and technology are making our lives healthier, easier, and more comfortable.
  • The benefits of science are greater than any harmful effects.
  • Science makes our way of life move too fast.
  • We depend too much on science and not enough on faith.

Their results showed no change in attitude toward science. Two explanations were offered. The first questioned whether the ‘central route to persuasion’ was really activated among participants, suggesting that participants might not have engaged in a thoughtful process. Secondly, they argued that attitude change could be much more complex than they expected, and the 4 items in the instrument might not be sensitive enough to detect it.

Misiti et al. (1991) developed a 23-item Likert scale to test science attitude in middle school students. They focused on 5 aspects (“subcomponents”) of attitude towards science which are:

  • Using science materials to do science activities (investigative processes).
  • Perceived comfort or discomfort related to classroom science.
  • Learning science content.
  • Reading or talking about science-related topics.
  • Viewing science programs on films or TV.

The authors stated that they embedded the attitude object (i.e. science) either explicitly or implicitly within each statement. A trial was run in which the coefficient alpha index of reliability (0.98) and the estimated average inter-item correlation (r = 0.32) were computed. The final set of statements was then selected using a number of criteria.

  • Getting science books from the library is a drag
  • I hate to keep records of science experiments in a notebook
  • Science films bore me to death
  • I wish science class lasted all day
  • I dislike watching science specials on television
  • I hate science class
  • Learning science facts is a drag
  • Working with science equipment makes me feel important
  • I would like to join a science club that meets after school
  • Looking through a microscope is not my idea of fun
  • Knowing science facts makes me feel good
  • I don’t mind doing an experiment several times to check the answer
  • I feel like daydreaming during science class
  • Sharing science facts that I know makes me feel great
  • I hate to study science out of doors
  • It’s neat to talk to my parents about science
  • I like to make science drawing
  • I wouldn’t think of discussing science with friends outside of class
  • I enjoy using mathematics in science experiments
  • I cannot wait until science class
  • I wish we didn’t have science class so often
  • Doing science projects at home is dumb
  • Science is one of my favourite classes

Further tests indicated that the scale yielded high internal consistency, evaluative quality, known groups validity and cross-cultural validity.
Note: Though the authors claimed to have ‘embedded attitude object implicitly or explicitly in the statement’, it is doubtful whether implicit attitude toward science can be measured since ‘science’ was explicitly present in each statement.
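The reliability statistics reported for this scale (coefficient alpha and average inter-item correlation) are straightforward to compute from a matrix of Likert responses. Below is a minimal sketch in Python/NumPy; the response data are invented for illustration, and negatively worded items (e.g. “I hate science class”) are assumed to be reverse-scored before analysis.

```python
import numpy as np

def reverse_score(responses, max_point=5):
    """Reverse-score negatively worded items on a 1..max_point Likert scale."""
    return (max_point + 1) - responses

def cronbach_alpha(scores):
    """Coefficient alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def mean_inter_item_r(scores):
    """Average of the off-diagonal entries of the item correlation matrix."""
    r = np.corrcoef(scores, rowvar=False)
    k = r.shape[0]
    return (r.sum() - k) / (k * (k - 1))

# Invented responses: 4 participants x 3 items on a 1-5 Likert scale.
# (Perfectly consistent responses, so alpha comes out at 1.0.)
data = np.array([[1, 2, 1],
                 [2, 3, 2],
                 [3, 4, 3],
                 [4, 5, 4]])
alpha = cronbach_alpha(data)
avg_r = mean_inter_item_r(data)
```

In a real analysis the negative items would be passed through `reverse_score` first, so that high scores consistently mean a positive attitude; otherwise alpha is artificially deflated.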

@lq33shan

lq33shan commented Dec 18, 2019

18/12/2019

Questions summary
Brossard et al. (2005) Likert scale [1]

  • Science and technology are making our lives healthier, easier, and more comfortable.
  • The benefits of science are greater than any harmful effects.
  • Science makes our way of life move too fast.
  • We depend too much on science and not enough on faith.

Misiti et al. (1991) Likert scale [2]
From 5 subcomponents:

  • Using science materials to do science activities (investigative processes).
  • Perceived comfort or discomfort related to classroom science.
  • Learning science content.
  • Reading or talking about science-related topics.
  • Viewing science programs on films or TV.

23 statements, 5-point Likert scale:

  • Getting science books from the library is a drag
  • I hate to keep records of science experiments in a notebook
  • Science films bore me to death
  • I wish science class lasted all day
  • I dislike watching science specials on television
  • I hate science class
  • Learning science facts is a drag
  • Working with science equipment makes me feel important
  • I would like to join a science club that meets after school
  • Looking through a microscope is not my idea of fun
  • Knowing science facts makes me feel good
  • I don’t mind doing an experiment several times to check the answer
  • I feel like daydreaming during science class
  • Sharing science facts that I know makes me feel great
  • I hate to study science out of doors
  • It’s neat to talk to my parents about science
  • I like to make science drawing
  • I wouldn’t think of discussing science with friends outside of class
  • I enjoy using mathematics in science experiments
  • I cannot wait until science class
  • I wish we didn’t have science class so often
  • Doing science projects at home is dumb
  • Science is one of my favourite classes

Friday Institute for Education Innovation (2012) 5-point Likert scale [3]

  • I am sure of myself when I do science.
  • I would consider a career in science
  • I expect to use science when I get out of school
  • Knowing science will help me earn a living
  • I will need science for my future work
  • I know I can do well in science
  • Science will be important to me in my life’s work
  • I can handle most subjects well, but I cannot do a good job with science
  • I am sure I could do advanced work in science

Yildirim & Selvi (2015) applied the above scale in Turkish.

Ornstein (2006) 5-point Likert scale [4]

  • I do not want to study any more science
  • I feel overwhelmed in science classes
  • I would or do belong to a science-related club.
  • I do not enjoy doing labs in my science class.
  • I like to share what I’ve learned in science class with my friends or family.
  • Science seems to be “over my head.”
  • I enjoy science classes.
  • I do not think about the things I learn in science class outside of school.
  • I enjoy reading books about science.
  • I do not enjoy participating in hands-on activities in my science class.
  • I would not try to learn about science on my own.
  • I have the ability to be successful in science classes.
  • I like going to my science class because I learn interesting things.
  • Learning things in science is easy for me.
  • I do not enjoy watching TV shows that deal with science.
  • I am able to easily understand topics in science.
  • I enjoy reading about science in the newspaper or magazines.
  • I do not enjoy talking about science with my friends.

Denessen et al. 2011 [5]

Explicit measures: Teachers were asked to rate their agreement with the statements on a four-point Likert scale.
(1) Teachers’ enjoyment for teaching science and technology (5 items, for example ‘I feel comfortable when teaching science and technology’, Cronbach’s alpha = .84),
(2) their perceived relevance of teaching science and technology (5 items, for example ‘I think it is important to teach science and technology’, Cronbach’s alpha = .80),
(3) their perceived levels of competence and efficacy related to teaching science and technology (5 items, for example ‘For me teaching science and technology is very difficult’, Cronbach’s alpha = .70),
(4) their motivation to invest in their science and technology teaching competences (5 items, for example ‘I am willing to invest in the development of my abilities to teach science and technology’, Cronbach’s alpha = .85).

Implicit measures: single-target IATs (measuring attitude toward a single target category rather than relative to an opposing category).
Two response keys (A and L) correspond to the two category sides on screen.
Two blocks: in the first, the left side of the screen pairs “negative” with “science and technology” while the right side shows “positive”; in the second, “negative” is on one side and “positive” + “science and technology” on the other.
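IAT results of this kind are typically summarised as a D-score: the difference between mean response latencies in the two blocks, divided by the standard deviation of all latencies. The sketch below is a simplified version that omits the error-penalty and trimming steps of the full scoring algorithm; the reaction times are invented.

```python
import statistics

def st_iat_d_score(block1_rts, block2_rts):
    """Simplified D-score: (mean of block 2 - mean of block 1) / SD of all latencies.

    The sign indicates which category pairing produced faster responses,
    i.e. which association was easier for the participant.
    """
    pooled_sd = statistics.stdev(list(block1_rts) + list(block2_rts))
    return (statistics.mean(block2_rts) - statistics.mean(block1_rts)) / pooled_sd

# Invented reaction times in milliseconds
block1 = [600, 620, 640]   # "negative" shares a key with "science and technology"
block2 = [700, 720, 740]   # "positive" shares a key with "science and technology"
d = st_iat_d_score(block1, block2)
```

Here responses are slower when “positive” shares a key with the target, so the score is positive; interpretation of the sign depends on the block ordering chosen in the experiment.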

References:
[1] D. Brossard, B. Lewenstein, and R. Bonney, “Scientific knowledge and attitude change: The impact of a citizen science project,” Int. J. Sci. Educ., vol. 27, no. 9, pp. 1099–1121, Jan. 2005.
[2] F. L. Misiti, R. L. Shrigley, and L. Hanson, “Science attitude scale for middle school students,” Sci. Educ., vol. 75, no. 5, pp. 525–540, 1991.
[3] “Student Attitudes toward STEM: The Development of Upper Elementary School and Middle/High School Student Surveys.”
[4] A. Ornstein, “The Frequency of Hands-On Experimentation and Student Attitudes Toward Science: A Statistically Significant Relation (2005-51-Ornstein),” J. Sci. Educ. Technol., vol. 15, no. 3–4, pp. 285–297, Oct. 2006.
[5] E. Denessen, N. Vos, T. Damen, S. Koch, M. Louws, and D. Wigboldus, “Explicit and Implicit Measures of Teacher Attitudes Towards Science and Technology,” in Professional Development for Primary Teachers in Science and Technology: The Dutch VTB-Pro Project in an International Perspective, M. J. de Vries, H. van Kuelen, S. Peters, and J. W. van der Molen, Eds. Rotterdam: SensePublishers, 2011, pp. 107–119.

@lq33shan

lq33shan commented Dec 19, 2019

19/12/2019

Research interest:

Attitude toward science is often measured with self-reported Likert scales. Surveys are conducted between different groups, or within the same group of participants at different times. The development of such scales is often heavily dependent on experts’ opinions and experiences.
For example, Misiti et al. (1991) sought advice from 8 former elementary science teachers to rank 15 ‘subcomponents’ of attitude toward science and employed the top 5 subcomponents (i.e. using science materials to do science activities; perceived comfort or discomfort related to classroom science; learning science content; reading or talking about science-related topics; viewing science programs on films or TV) as the basis of the scale. Trial statements under each subcomponent were then drafted and disseminated for a pilot study. A selection process took place after the trials, in which only statements meeting a number of criteria were included in the final scale, so that reliability and validity were enhanced.
Drawing on experts’ opinions in the scale development process is important, as experts are often direct practitioners in the relevant field and have more insight into the situations. Evaluations from experts’ perspectives enable researchers to match theoretical concepts with practical experience and minimise the gaps in understanding between participants and themselves.

However, it can also be a problem that researchers rely too much on expert opinions and rarely develop the measurement tools based on a theoretical construct. Some of the papers measuring attitude change towards science only give a vague definition of attitude in the introduction and do not elaborate further. The lack of a conceptual basis could result in an incomplete understanding of the problem, which could further lead to biases in the tool.
The ABC model of attitude is a well-known model commonly used in psychological research to understand people’s attitudes toward attitude objects. A, B and C stand for the Affective, Behavioural and Cognitive components, respectively. The model proposes that attitudes comprise these three components, and it can explain the source of one’s attitude toward an attitude object. (Will explain this further next time)
Therefore, the current research will adopt this model to evaluate the tools for attitude reporting in the field of citizen science.

Meeting with Yaela

  • Training on general information about Human Ethics
  • Discussed the general process of research
    o Literature searching (gather science attitude scales in the main databases)
    o Development of a codebook to analyse the tools
    o Evaluation of the tools (within project)
    o Between-project comparison
    o Recommendations and suggestions
  • Next meeting Tuesday
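As a toy illustration of the planned codebook step, survey items could be tagged with ABC-model components by matching cue words. The keyword lists below are invented for illustration only; in practice the coding would be done by human raters against an agreed codebook, with inter-rater reliability checked.

```python
# Hypothetical keyword codebook mapping ABC components to cue words.
CODEBOOK = {
    "affective": ["enjoy", "hate", "bore", "love", "comfortable", "feel"],
    "behavioural": ["join", "participate", "watch", "read", "talk"],
    "cognitive": ["think", "believe", "understand", "know"],
}

def code_item(item):
    """Return the sorted list of ABC components whose keywords appear in the item."""
    text = item.lower()
    return sorted(comp for comp, keywords in CODEBOOK.items()
                  if any(kw in text for kw in keywords))

# Items taken from the scales summarised above
examples = [
    "I hate science class",
    "I enjoy reading books about science",
    "I am able to easily understand topics in science",
]
coded = {item: code_item(item) for item in examples}
```

An item can match more than one component (e.g. “I enjoy reading books about science” is both affective and behavioural), which is exactly the ambiguity a human-rated codebook and clear decision rules would need to resolve.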
