Category Archives: General

Why UCF should allow faculty and staff to change Windows 10 taskbar display settings

June 21, 2017

My bid to get the University of Central Florida (UCF) IT department to allow education faculty and staff to change taskbar settings, so they could ungroup Windows 10 taskbar items and display labels in addition to icons, was shot down. I was told this issue does not affect job performance in any way and that no changes are needed because work is not being impeded. My concluding remarks:

Thanks, [redacted], for your help! I disagree with [redacted]—faculty in the education department are provided with dual monitors, even though by this standard, single monitors would not impede work. I believe that, like dual monitors, being able to ungroup items on the taskbar and to display labels instead of icons would improve productivity. However, I will take no further action.

EME 6646 Assignment on Measuring Creativity, Neuroimaging, Psychometrics, and Methods

Assignment 5, Part A: Individual Explanation of Imagination and Creativity
For EME 6646: Learning, Instructional Design, and Cognitive Neuroscience
By Richard Thripp
University of Central Florida
June 15, 2017

Measuring Creativity: Neuroimaging or Psychometrics?

When researchers using neuroimaging techniques seek to compare brain activity between people who are especially creative and people of average creativity, how do they do so? One might think this would be accomplished by using neuroimaging techniques to determine who is more creative. However, the pretty pictures of brain activity we see in many journal articles are actually the result of averaging and subtraction (Sawyer, 2011). In truth, most of the brain is active almost all the time; what we are really looking at is whether particular regions are comparatively more or less active than others, and this difference is often only 3% if we are lucky (Sawyer, 2011). Brain scans in which certain “creative” regions are shown in bright red may lead readers astray, because they do not convey how tiny the differential in brain activity actually is.
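The subtraction logic Sawyer (2011) describes can be sketched numerically. This is an illustrative toy example of my own, not any study’s actual pipeline; the trial counts, voxel counts, and 3% figure are assumptions chosen to mirror the point above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-trial "voxel" activity: baseline activity is high
# everywhere (most of the brain is active most of the time), and the
# task condition raises a small region by only ~3%.
n_trials, n_voxels = 40, 1000
baseline = rng.normal(loc=100.0, scale=5.0, size=(n_trials, n_voxels))
task = rng.normal(loc=100.0, scale=5.0, size=(n_trials, n_voxels))
task[:, :50] += 3.0  # a ~3% bump in 50 "creative" voxels

# Subtractive averaging: average each condition across trials, then
# subtract, leaving only the difference map.
diff = task.mean(axis=0) - baseline.mean(axis=0)

# The colorful figure plots `diff`, not absolute activity, so the 3%
# bump dominates the image even though both conditions hover near 100.
print(f"bumped region, absolute ratio: {task[:, :50].mean() / baseline[:, :50].mean():.3f}")
print(f"difference map, bumped region: {diff[:50].mean():.2f}")
print(f"difference map, elsewhere:     {diff[50:].mean():.2f}")
```

Plotting `diff` on a color scale produces exactly the “bright red region” effect: a 3% change renders as a saturated blob against a near-zero background.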

Perhaps because our current ability to measure actual brain activity is not a useful indicator of creativity, neuroimaging cannot yet be used directly to determine an individual’s level of creativity. Thus, even studies employing neuroimaging typically fall back on psychometric measures. For example, Jaušovec’s (2000) empirical investigation is titled “Differences in cognitive processes between gifted, intelligent, creative, and average individuals while solving complex problems: An EEG study” (p. 213). At first glance, one might think electroencephalography (EEG) is being used to determine whether someone fits into the four categories of “gifted,” “intelligent,” “creative,” or “average.” However, Jaušovec actually used the Wechsler Adult Intelligence Scale (WAIS, the “IQ test”) and the Torrance Tests of Creative Thinking (TTCT) to organize participants into these categories, defining “gifted” as doing well on both tests, “average” as doing well on neither, and the other two categories as doing well on one test but not the other. He then found minor differences in EEG readings when participants solved open- or closed-problem tasks, and concluded that intelligence and creativity are probably distinct, and that patterns of brain activity are related to both. Knowing that even the best psychometric tests have substantial measurement error (e.g., IQ tests measure not only intelligence but also familiarity with written language and academic environments), that grouping people as Jaušovec (2000) did introduces further error (I have reproduced his grouping table below), and that EEG itself lacks spatial resolution, Jaušovec’s methods seem too muddy to support any firm conclusions. However, it is not as though I have cherry-picked an unknown, dubious study: according to Google Scholar, his article has an impressive 239 citations!
With recent arguments suggesting that even EEG’s temporal resolution is overblown (Burle et al., 2015), our ability to draw confident conclusions diminishes further.

Figure 1. Grouping table for intelligence and creativity categories, reproduced from Jaušovec (2000, Table 4).

While EEG is not in the same vein as magnetic resonance imaging (MRI), near-infrared spectroscopy (NIRS), or positron emission tomography (PET), the use of psychometrics as an organizing device, and of subtractive averaging as a method to present pretty pictures implying big results, remains applicable. I have difficulty seeing the ethical difference between subtractive averaging and removing the zero axis on a bar chart to make bars of only slightly different heights appear vastly different.

Neuroimaging and Psychometrics in Creativity Research: A Corroboration Model

Psychometrics, the science of mental measurement, is by definition messy and imprecise. However, corroborating psychometric instruments with neuroimaging techniques may help us understand creativity more accurately. This is what Arden, Chavez, Grazioplene, and Jung (2010) advocate in their literature review and position piece on neuroimaging creativity. Researchers all use different criteria to measure and interpret creativity, yet there has been no concerted effort toward detailing the “psychometric properties of creative cognition” (Arden et al., 2010, p. 152), which is needed to be able to compare studies to one another. Nevertheless, employing neuroimaging has already allowed us to debunk, or at least fail to find support for, common hypotheses such as creativity being linked to the right brain or to improved neural function (Arden et al., 2010). If we continue to improve the reliability and validity of creativity research along both psychometric and neuroimaging dimensions, we will improve our limited understanding of creativity, which is particularly needed in areas such as novelty and originality (Fink, Benedek, Grabner, Staudt, & Neubauer, 2007). Limited spatial resolution prevents us from accurately isolating brain activity; at the same time, the prevailing paradigm of neuroscience creativity research remains oriented toward finding the specific areas of the brain that are associated with creativity (Arden et al., 2010; Sawyer, 2011), when the correct answer may be that all of them are, although some more so than others. Modern techniques reviewed by Jung, Mead, Carrasco, and Flores (2013), such as structural magnetic resonance imaging (sMRI), diffusion tensor imaging (DTI), and proton magnetic resonance spectroscopy (1H-MRS), are critical to isolating the structural characteristics of creative cognition, and might be seen as a complement to, rather than a replacement for, the proxy measures that psychometric instruments constitute.
Finally, lesion studies reveal that areas of the brain may actually compete in parallel to reach creative solutions, with the right medial prefrontal cortex (mPFC) winning out in healthy subjects even though it produces inferior results (Jung et al., 2013). When corroborated with psychometric measures, this may lead to the amusing finding that high creativity is associated with brain damage (i.e., lesions in left-hemisphere language areas).

Methodological Issues in Neuroscience-Based Creativity Research

Even recent creativity research is often devoid of neuroimaging. For example, Anderson, Potočnik, and Zhou’s (2014) “Innovation and creativity in organizations: A state-of-the-science review, prospective commentary, and guiding framework,” published in the Journal of Management and focused on 2002–2013 research, defines creativity as “idea generation” and looks at studies that solely use observational and self-report data. In an organizational context, it is still unheard of to use MRI, DTI, 1H-MRS, et cetera, and even EEG is rare. Moreover, the research corpus itself is scattered and disjointed (Batey & Furnham, 2006). Consequently, sound methods are even more important for the few researchers who are able to use neuroimaging methods.

A big issue, exemplified in Jaušovec (2000) and reiterated by Arden et al. (2010), is the use of case-control designs in which subjects are unnecessarily dichotomized into high- and low-creativity buckets instead of respecting the continuous nature of creativity. Even psychometric measures such as the Torrance tests do not classify people in binary fashion, but rather across a range of scores. Respecting this continuity improves statistical power.
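The cost of dichotomizing can be sketched with a quick simulation. This is my own toy example, not drawn from any of the cited studies; the effect size and sample size are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Continuous "creativity" scores and an outcome that tracks them.
creativity = rng.normal(size=n)
outcome = 0.5 * creativity + rng.normal(size=n)

# Case-control style: dichotomize at the median into low/high groups.
high = (creativity > np.median(creativity)).astype(float)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

r_continuous = corr(creativity, outcome)
r_dichotomized = corr(high, outcome)

# The classic result: a median split attenuates the observed correlation
# by a factor of roughly sqrt(2/pi), i.e. about 20% of the signal (and
# correspondingly more statistical power) is thrown away.
print(f"continuous:   r = {r_continuous:.2f}")
print(f"dichotomized: r = {r_dichotomized:.2f}")
```

The same information loss applies when a TTCT score distribution is collapsed into “creative” versus “average” groups before comparing brain measures.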

Using expensive and cumbersome technologies such as PET or fMRI requires lying down, perfectly still, with loud whirring noises (Sawyer, 2011). Even EEG requires electrodes attached to one’s head, which impairs many creative activities. Methodologically, this is a large problem that is presently not surmountable. There is no way to measure creativity with an fMRI while a subject plays a violin (except, perhaps, a pizzicato performance). Moreover, neuroimaging studies do not measure novelty or usefulness, unlike common definitions of creativity used by non-neuroscience researchers (Sawyer, 2011).

Lastly, although there are many other methodological issues, neuroscience creativity research would be furthered by accurate reporting and disclosure of averaging and subtraction techniques and of the actual activation levels observed temporally and/or spatially (Sawyer, 2011). Speculation about causation should be clearly marked as such. Researchers should also refrain from labeling any region of the brain as a center for a specific creative task, or for creativity in general (Arden et al., 2010). Even though such determinations generate popular press, they are typically inaccurate.


Anderson, N., Potočnik, K., & Zhou, J. (2014). Innovation and creativity in organizations: A state-of-the-science review, prospective commentary, and guiding framework. Journal of Management, 40, 1297–1333.

Arden, R., Chavez, R. S., Grazioplene, R., & Jung, R. E. (2010). Neuroimaging creativity: A psychometric view. Behavioural Brain Research, 214, 143–156.

Batey, M., & Furnham, A. (2006). Creativity, intelligence, and personality: A critical review of the scattered literature. Genetic, Social, and General Psychology Monographs, 132, 355–429.

Burle, B., Spieser, L., Roger, C., Casini, L., Hasbroucq, T., & Vidal, F. (2015). Spatial and temporal resolutions of EEG: Is it really black and white? A scalp current density view. International Journal of Psychophysiology, 97, 210–220.

Fink, A., Benedek, M., Grabner, R. H., Staudt, B., & Neubauer, A. C. (2007). Creativity meets neuroscience: Experimental tasks for the neuroscientific study of creative thinking. Methods, 42, 68–76.

Jaušovec, N. (2000). Differences in cognitive processes between gifted, intelligent, creative, and average individuals while solving complex problems: An EEG study. Intelligence, 28, 213–237.

Jung, R. E., Mead, B. S., Carrasco, J., & Flores, R. A. (2013). The structure of creative cognition in the human brain. Frontiers in Human Neuroscience, 7, 1–13.

Sawyer, K. (2011). The cognitive neuroscience of creativity: A critical review. Creativity Research Journal, 23, 137–154.

Six Reasons Why Evernote is a Freaking Joke

Starting in summer 2016, I took to using Evernote to take notes at meetings and classes instead of the pen and paper I had used for nearly a decade (I have paper notes dating back to my first college courses in fall 2007). I take notes with a Logitech K810 Bluetooth keyboard connected wirelessly to my Samsung Galaxy S7 phone, using the Samsung Wireless Charger as a stand. I’m not sure why, but many people are surprised or impressed by this. I recall my classes at the University of Central Florida being filled with people on laptops or MacBooks, which seems like overkill to me and is probably quite a bit less productive than my setup (although I often devolve into transcribing verbatim because I can type 100+ WPM, the small size of a smartphone removes the “wall” between others and me, and it discourages multitasking).

My use of Evernote is fairly basic: primarily for text-based notes. I regularly use only my smartphone and home desktop PC, so Evernote’s recent decision to limit syncing to two devices (and simultaneously raise the prices of paid subscriptions with no new features) did not affect me. I’ve never approached Evernote’s atrociously small 60 MB per month data limit. I find the PC client useful for its search functionality, and like being able to view, write, and update notes from my phone. I’ve recently expanded into using Evernote to take photos of whiteboards or handwritten notes where necessary; it does a nice job of correcting lighting issues and cropping + straightening.

Nevertheless, it is abundantly clear that Evernote, as a company and product, is a freaking joke. Here are just a few big reasons why:

1. Evernote notes literally cannot be printed. Customers have been complaining about this basic feature being broken for over SIX years, and Evernote doesn’t give a crap. If your note has any italics or bold text, the printing comes out wonky. Like many “support” forums, Evernote’s forums have descended into people blaming users for trying to use an app for something it’s not designed/intended to do. Some even say printing is stupid! Evernote employees actually advise not using bold or italics as a workaround. I kid you not.

2. Evernote sync just plain sucks. Their solution is to put conflicted items in a “Conflicting Changes” folder, and changes can “conflict” even if the user does everything right. Evernote gurus instruct users on the forums that they should be very careful to click the “Sync” button and wait for it to complete before attempting to edit a note on any other device. Basically, blame the user for something that other apps like Google Docs have figured out. As for Conflicting Changes, Evernote offers no “diff” feature to compare these conflicts. Even MediaWiki (the bungling, convoluted PHP disaster behind Wikipedia) offers diffs. This can’t be too hard to implement.
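Indeed, a line-by-line diff of two conflicting note versions takes only a few lines with Python’s standard-library `difflib`. This is a generic sketch of the technique, not Evernote’s internals, and the note text is invented:

```python
import difflib

# Two versions of the same note, as they might land in "Conflicting Changes".
version_a = """Meeting notes
Budget approved
Next meeting: Friday""".splitlines()

version_b = """Meeting notes
Budget approved with amendments
Next meeting: Friday
Action item: email the team""".splitlines()

# A unified diff marks removed lines with '-' and added lines with '+',
# which is all a user needs to reconcile a conflict by hand.
diff = list(difflib.unified_diff(version_a, version_b,
                                 fromfile="note (this device)",
                                 tofile="note (other device)",
                                 lineterm=""))
print("\n".join(diff))
```

If a one-person standard-library script can show which lines diverged, a note-taking company presumably could too.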

3. Evernote’s PR is a freaking trainwreck. First, they had the poorly executed 2-device limit in June 2016, coupled with a price increase that makes Evernote almost as expensive ($7.99 vs. $9.99 per month) as Adobe Photoshop, an infinitely more complex and, feature for feature, far less stupid program. (Admittedly, Adobe for some reason removed tabbed browsing in Acrobat X and XI and then their forum volunteers/employees enjoyed going off on customers about why they shouldn’t want/need tabbed browsing, and then there is Adobe Flash, but I digress.) Second, in December 2016 they announced a Christmas present: a new privacy policy that says they will read your notes whenever they feel like it. You can’t make this stuff up.

4. Evernote forum morons (I don’t actually participate in support forums but often read them or find them via Google) repeat the mantra that Evernote, in addition to NOT being a word processor or synchronization tool, is also not a collaboration tool. Anyone who finds issue with this is Just Plain Stupid™. Of course, then, I think, I must be pretty stupid. What the hell am I using this piece of crap for? The search and “sync” functions? (The latter should really be labeled “backup” because remember, you should only edit an Evernote note on one device at a time or Bad Things Will Happen, quite literally, because you’ll have to manually reconcile a “Conflicting Change” with a note that’s missing half of what you wrote.)

5. Evernote’s Android and iOS apps literally bog down if your note is over about 10,000 characters. Seriously, in a 3-hour lecture, I’ve had to make three separate “Part 1,” “Part 2,” and “Part 3” notes because the typing becomes slower until the words don’t appear until many seconds after I’ve typed them, making corrections and even moving the cursor agonizingly slow. This is for a PLAIN TEXT note on the Samsung Galaxy S7, literally the Greatest Smartphone that Doesn’t Seem to Have an Explosion Problem in the World™. Seriously, I’m pretty sure my phone became hot to the touch because of the Evernote Android app not being able to handle a 10 KB note. This is like something out of 1991 (the year of my birth). The PC client doesn’t have this problem, but this is simply ludicrous to begin with.

6. Evernote cannot save an empty note. What? Even Windows Notepad can save a text file with no text in it. On the PC client, I often end up titling a note before typing anything in it, so I put a “.” in the note body to be able to save it. After I write the note, the preview in the viewing pane forever remains “.”! There is literally no way to update the preview snippet. Actually, this seems to be working now in v6.4.2.3788 of the PC client, but I’m pretty sure it wasn’t working before. Still, Evernote will never fix the printing issue for some reason and apparently believes it is teaching its users and customers a lesson (yes, even the paid version can’t print).

As you can see, I am still using Evernote, but in addition to these grievances, I did once suffer data loss, losing part of a note when switching from PC to phone. Then again, this is Probably My Fault Anyway™, and I should not feel so entitled as to complain about a free product I am choosing to use of my own free will. Google can’t even get search to work properly in Google Calendar; one commentator says it doesn’t matter because Google doesn’t make ad revenue off Google Calendar anyway, and another says I should be grateful to even have a Google Calendar. So, in the spirit of Christmas, I am grateful for my Google Calendar and this freaking joke we call Evernote. (Though, in a year, it will probably be pretty embarrassing to look back and see I was still using Evernote at the end of 2016.)

New Job as UCF FCSUA Web Developer

I have started working as a web developer for University of Central Florida’s new Florida Center for Students with Unique Abilities. Check out the progress Dr. Rebecca Hines (sister of Cheryl Hines!) and I have made on the site so far:

The Florida Senate established this new program which will serve the whole state from its headquarters at University of Central Florida in Orlando:

Working on WordPress with GoDaddy IIS hosting (chosen by the previous developer) has been awful. GoDaddy is so much buggier and slower than my host, SYN Hosting. I’m hoping to get approval to switch soon, though we’ll lose money on our GoDaddy subscription.

Statement on Orlando Nightclub Shooting, 6/12/2016

In the Orlando nightclub mass shooting on 6/12/2016, it is puzzling that the police handled an alleged “hostage situation” by waiting outside for 3 hours while 50 people were murdered and 53 more were injured. Surely the police should have heard the shots and screams and intervened earlier? I cannot fathom what sort of “hostage situation” would be handled by allowing the alleged lone gunman to injure or kill half of the approximately 200 people in the nightclub, over a period of 3 hours (2:02–5:00 a.m.).

Source: ORLANDO SENTINEL: Orlando nightclub gunman called 911 before attack, pledged allegiance to Islamic State [local mirror]