Denial101x
I enrolled in Denial101x, partly to see what MOOCs are all about and partly to see what Cook and co were up to now.
Production values are high. The videos are slick and well integrated with surveys, discussion forums and quizzes that test how well you understood the material.
The contents are poor. The title is a case in point: "Denial" is a deliberate provocation, used to reinforce the tribal identities that polarize the climate debate.
The opening survey was full of leading questions, and the accompanying quizzes are not much better. As an example, students are asked to pass judgement on parents who choose against vaccination. Little information is given about the situation. Parents may opt out of vaccination because they are confused about its pros and cons, because vaccination conflicts with their religious beliefs, or because they have superior knowledge of their child's medical condition. Who knows? Yet students of this course are led to believe that anti-vaxxers are like Holocaust deniers.
As another example, students are asked to judge the size of the urban heat island effect, again without providing much context. The world has warmed by some 0.8K on average since the start of the industrial revolution. Cities are 2-4K warmer than the surrounding countryside. Is that large or small? Cities occupy only a tiny fraction of the planet's surface, so if area is your frame of reference, the urban heat island effect is not that important. On the other hand, more than half of all people live in cities, so if population is your frame of reference, the urban heat island effect is a lot bigger than greenhouse warming. And, as thermometers tend to be where people are, the urban heat island effect is quite an important factor when homogenizing temperature records. Such nuances, so common in a university education, are absent in Denial101x.
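To make the frames-of-reference point concrete, here is a back-of-the-envelope sketch in Python. The area and population fractions are my own rough illustrative assumptions, not figures from the course.

```python
# Back-of-the-envelope comparison of the urban heat island (UHI) effect
# under two frames of reference. All figures are rough assumptions.
global_warming = 0.8    # K, warming since the industrial revolution
uhi = 3.0               # K, mid-point of the 2-4K urban-rural difference
urban_area_frac = 0.01  # assumed: cities cover roughly 1% of the surface
urban_pop_frac = 0.55   # assumed: just over half of people live in cities

# Area frame: spread the UHI over the whole planet.
print(f"Area-weighted UHI:       {uhi * urban_area_frac:.2f} K vs {global_warming} K")
# Population frame: average the UHI over people instead of area.
print(f"Population-weighted UHI: {uhi * urban_pop_frac:.2f} K vs {global_warming} K")
```

Area-weighted, the urban heat island is negligible next to 0.8K of greenhouse warming; population-weighted, it is roughly twice as large.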
I guess this comes as no surprise, as the team behind Denial101x is the team behind SkepticalScience, that is, a bunch of none-too-bright rabid environmentalists.
It is a surprise, at least to me, that the University of Queensland and edX lend their names to this. I inquired with edX, and they confirmed that quality control is not their thing. Indeed, they are not even bothered when lecturers seem unsure about their own qualifications and employers. The course leader does not have a PhD, and in the first two weeks we saw a climatologist lecture in psychology and a chemist lecture in climatology. A bad advertisement for MOOCs.
As an aside, after my course on Climate Economics, one of the students asked me where I stood on the politics of it all.
PPPS: Cook's missing papers
A full reconstruction of Cook's 97% nonsensus is still lacking. However, Sou of Bundanga may have unraveled one further mystery.
In the data that Cook made available, abstract IDs run from 1 to 12,876. The paper says that 12,465 abstracts were downloaded, of which 11,944 were used.* So, 411 abstracts are unaccounted for. 411 is a large number relative to the number of papers that drive the alleged consensus, but, not knowing why the 411 were missing, I only included a puzzled footnote in my Energy Policy paper.
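The accounting is easy to check. Here is a minimal sketch, assuming a hypothetical ratings.csv with one abstract ID per row; the file name and layout are my assumptions, not Cook's actual release format.

```python
# Minimal sketch: find the abstract IDs that never appear in the data.
import csv

with open("ratings.csv") as f:  # hypothetical stand-in for Cook's data
    ids = {int(row[0]) for row in csv.reader(f) if row}

missing = sorted(set(range(1, max(ids) + 1)) - ids)
print(f"{len(ids)} IDs present, {len(missing)} IDs missing")
# With IDs running to 12,876 and 12,465 abstracts in the data,
# 12,876 - 12,465 = 411 IDs are unaccounted for.
```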
Here is an explanation. Cook downloaded the abstracts from the Web of Science in two batches. The first batch was the largest. After they had rated all that, there were a good few more recent publications, so a second batch of abstracts was downloaded. There was overlap between the first and second batch, and 411 duplicates were removed. So far, so uncontroversial.
However, if Sou is to be believed, duplicates were removed from the FIRST batch, already rated, rather than from the second batch. The missing abstracts are indeed disproportionately concentrated among the lower IDs, which is consistent with this explanation: the default data dump from the Web of Science presents the more recent papers first, and more recent papers are much more likely to overlap between Cook's two data dumps. By removing already-rated abstracts, Cook created more work for his raters and denied himself an opportunity to test data quality.
UPDATE
Sou offers an alternative explanation. Apparently, Cook queried WoS and downloaded the data in chunks. Some chunks were downloaded twice, or perhaps they were uploaded twice into Cook's database. For this explanation to work, we would have to believe that the data chunks were as small as 342 abstracts, or even 63 abstracts, or maybe even 4. Recall that Cook had 12,000+ abstracts.
Alternatively, Cook may have split his query, e.g. by discipline. This would lead to sizable data chunks, but the pattern of overlap would be random. Cook's overlaps are concentrated. According to Sou, the missing IDs are (a sketch for recovering such runs follows the list):
- IDs 5 to 346 inclusive = 342
- IDs 1001 to 1004 inclusive = 4
- IDs 2066 to 2128 inclusive = 63
- Total = 409; the other two are probably isolated somewhere.
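For what it is worth, such runs are easy to recover from the sorted list of missing IDs computed in the earlier sketch; grouping on value minus position is a standard idiom for splitting a sorted list into runs of consecutive integers.

```python
# Group consecutive missing IDs into contiguous runs. Within a run of
# consecutive integers, (value - position) is constant, so groupby on
# that key splits the sorted list into runs.
from itertools import groupby

def contiguous_runs(missing_ids):
    for _, grp in groupby(enumerate(missing_ids), key=lambda t: t[1] - t[0]):
        block = [v for _, v in grp]
        yield block[0], block[-1], len(block)

for lo, hi, n in contiguous_runs(missing):  # 'missing' from the earlier sketch
    print(f"IDs {lo} to {hi} inclusive = {n}")
# Per Sou, this should report 5-346 (342), 1001-1004 (4), 2066-2128 (63).
```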
I find all this implausible. If true, it would explain why my query returns 13,431 papers but Cook has only 12,465 papers in his data: some data chunks were downloaded but not uploaded.