Adam Neely on “a personal connection between artist/creator and fan”

September 17, 2018 Leave a comment

Bassist Adam Neely has one of the best music education channels on YouTube. In this interview by Patrick Hunter, Adam said something at the 15:25 mark that I thought was sufficiently profound that I wanted to transcribe it (I’ve paraphrased slightly):

People are very much caught in the old idea that the recorded music that they make is worth something. I believe it should be worth something, but I don’t think it is to anybody else… so, in order to survive (not judging, just what we need to do), it has to be a lot more about the musician as a personality or the musician as a brand. Somebody will connect with your music because of who you are and how you’re presenting your music, rather than just listening to the music itself. Of course, people are like, “oh, it should be all just about the music, and you should judge it based on that.” But it’s much more about having a personal connection between artist/creator and fan. It used to not be that way; there was a middle man. There was the record industry, and the record industry served a function in terms of distribution. Now that distribution has been shifted to tech companies, so there’s a much different model between the musician and the person listening. There’s the tech company intermediating, but the tech company says, “alright, you need to tell a story with yourself and be a personality and engage people that way, and that’s the way that we’re going to be able to monetize what it is that you do.” So musicians can do this, through youtube, Patreon, sponsorships — there’s all different kinds of ways you can monetize yourself. Touring, a little bit. But that’s definitely going to be the wave of the future, and we’re weirdly at the cusp of it…

We all have to be polymaths… We all have to be everything. We have to be Leonardo da Vinci now. But I don’t think that’s a bad thing. That’s just another way of thinking about what it is that we do, and the sooner that you embrace that fact, the better.

Categories: Uncategorized

“The present is the least interesting time to live in.”

January 15, 2018 Leave a comment

“Have a glimmer of an idea; take it out 30 years where there is no possibility of worrying about ‘how am I going to get from where I am now to this idea?’ That is the idea killer of all time: ‘how is this incremental to the present?’ The present is the least interesting time to live in.” — Alan Kay (at the 41 min 20 sec mark)

Alan Kay, 2015: Power of Simplicity

Categories: Uncategorized

My first CS class started with Turing machines

January 5, 2018 1 comment

My freshman year at Washington University in St. Louis began in the Fall of 1989. I took a class called CS135. It was required of CS majors, as well as of majors in other fields that had an introductory CS course requirement. I have no recollection of what the class was actually called, and alas I cannot find any references to it on the web.

The class was roughly divided into three equal parts: 1) Turing machines, 2) assembly language, and 3) Pascal.

The lab computers were early MS-DOS PCs of some sort, with a couple of 5 1/4″ disk drives. There were five machines per table, linked to a dot matrix printer via a hardware switch.

Seriously, the class started with Turing machines. We were provided a Turing machine simulator, where we’d convert our state diagrams into a table, enter the input tape, and watch the states change and the Turing machine’s read/write head move back and forth and twiddle the symbols on the tape.
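The core of such a simulator takes only a few lines. Here’s a sketch in Python (the actual lab software is lost to time, so the table format, state names, and the example machine are my own invention): a transition table maps (state, symbol) pairs to a new state, a symbol to write, and a head movement.

```python
# A minimal Turing machine simulator in the spirit of the one from lab
# (hypothetical reconstruction; state names and table format are my own).
# The transition table maps (state, symbol) -> (new state, write symbol, move).

def run_turing_machine(table, tape, state="q0", blank="_", max_steps=1000):
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        symbol = tape[head] if 0 <= head < len(tape) else blank
        if (state, symbol) not in table:  # no applicable rule: halt
            break
        state, write, move = table[(state, symbol)]
        if head < 0:                      # grow the tape on demand
            tape.insert(0, blank)
            head = 0
        elif head >= len(tape):
            tape.append(blank)
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank), state

# Example: sweep right, flipping each bit, and halt on the trailing blank.
flipper = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(flipper, "0110"))  # ('1001', 'halt')
```

Stepping through one transition at a time, printing the tape and head position as you go, gives you exactly the experience the lab software provided.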

The second section of the class used a fake assembly language called SNORE. You’d type in your program and see which instruction was being executed and watch the registers change. Alas, I can’t find any information on SNORE on the web; this may partially be the result of the word SNORE (like Processing) naturally getting a lot of Google hits that have nothing to do with programming.

The final third of the class was Pascal, and was much more along the lines of what most intro CS classes at other schools were probably like at the time. (Pascal was all the rage in the 80s, largely because Philippe Kahn got the idea to sell Turbo Pascal for $50, when compilers from companies such as Microsoft cost hundreds of dollars, were slower, and overall provided a much less pleasant user experience.)

The second class, CS236, was entirely in Pascal, and covered more advanced topics like pointers. I tested out of CS236, and went straight to taking CS301, which was a typical “discrete math for CS folks” class. (I had never formally studied pointers, but after a few examples it was easy enough to figure out what they did and be able to step through and hand scratch a few programs using them.)

Looking back on CS135, almost three decades later, and almost two decades into teaching, I wonder what the designers of the class were thinking. I don’t particularly recall what my feelings were as a student — I probably thought the Turing machines were interesting puzzles but wanted to get on with “real” programming. Of course, I see that the designers were going for a more CS-flavored version of the Patt & Patel Introduction to Computing Systems: From Bits and Gates to C and Beyond curriculum, starting at a low level and then building increasing layers of abstraction. Except Turing machines are of theoretical interest; no practical computer directly uses the Turing machine model of computation as its core. I can’t imagine that the Turing machines felt exciting and motivating to anyone who wasn’t already highly inclined towards CS — and there were quite a few non-CS majors in the class.

In any case, CS135 was short-lived, soon to be replaced by CS101, which was based on Scheme — but that’s the topic for another post.

So, dear readers: did any of you have an unusual introductory CS course? What do you think about starting out with Turing machines?

Categories: Uncategorized

Inbox Infinite

It’s May 9th, and I struggle under the weight of 2389 unread messages, out of 5694 total, in my Georgia Tech e-mail inbox. The earliest unread message is from March 10th.

I’ve barely paid any attention to my e-mail at all for the past four weeks. There have been stretches of three, four, or even five days where I haven’t looked at e-mail at all, and when I did, I either skimmed subject headings as I gave my iPhone screen one solid flick, or I searched specifically for a subject keyword I had told my students to include.

The Culprits

Although I certainly get behind on e-mail here and there — just like everyone else — I’ve been unusually bad at it lately. I suspect that five things conspired:

1) Georgia Tech added two-factor authentication for logging into the VPN. A while back they added a requirement that faculty be logged into the VPN to access e-mail through something like Apple Mail. Having to take the extra 30 seconds to log into the VPN was annoying, but not too bad. But requiring two-factor authentication added a lot of annoyance to the process. I have to start a specific app on my phone called Duo and hit the approve button within a limited amount of time for my laptop to connect to the VPN. I don’t always keep my phone with me when I’m working at the house, so to check my e-mail, I have to get up and hunt for my phone. I realize this seems like a small annoyance, but the net effect is that it encourages me to put off checking e-mail. That said, I do see the wisdom of two-factor authentication; I’m pretty sure John Podesta wishes he had used it. (Of course, I can read and send e-mail on my phone, but I dislike reading any e-mail longer than two sentences on my phone, and I despise trying to compose e-mails on my phone.)

2) The effective signal-to-noise ratio of e-mail is asymptotically approaching zero, and I’m not just referring to the battle against spam. Even e-mails ostensibly of value — confirmations of a book order on Amazon, an announcement of a show by a band you love that doesn’t play out very often, etc. — merge into an e-cacophony. In the early 90s, few outside academia had e-mail, and even when AOL’s dial-up service opened up e-mail to a wider audience, most of the e-mail you received came from people you knew and generally wanted to converse with. Even when companies started having websites to show their wares, you generally had to call and give them a credit card number the old-fashioned way, so most of your e-mail came from individuals, not companies. There was a time when starting your e-mail software (I used the Unix “mail” throughout school, switching to the more sophisticated “pine” when I started my postdoc; I switched from “vi” to “emacs” at the same time) sometimes induced tingly anticipation. Now, starting an e-mail client generally fills people with a combination of disgust and dread.

This is analogous to your U.S. Postal Service mailbox. How often do you receive letters in your physical mailbox that you truly desire? Nowadays it’s pretty much bills and advertisements. As e-mail has gone the same way, communication with people you want to communicate with has moved to other venues like Facebook Messenger. I confess I’ve had colleagues ping me on Facebook to double-check that I received some e-mail. On one occasion, an associate editor of a journal messaged me on Twitter during a similar epoch when I was running so behind on e-mail that I was walking backwards. It seems like every new form of communication eventually gets jammed up with junk. People flee to a new medium, only to slowly realize that the junk followed them.

3) This semester was the maiden voyage of my senior-level special topics class “Guitar Amplification and Effects,” which included a laboratory component. Lab assignments are much more time-consuming to create than pen-and-paper homeworks. During the semester, I’d often run into roadblocks while formulating a new lab, and I’d wind up deciding to put off the lab and just move on to the next bit of lecture material. Then, one day, I ran out of lecture material. The net effect was that all of the labs wound up scheduled near the end of the semester, and when I was waist deep in 400-volt power supplies and vacuum tubes, I wasn’t looking at e-mail.

4) I spent the Spring 2014 semester teaching at Georgia Tech Lorraine, our remote campus in Metz, France. Halfway through that semester, I had a similar realization that I had fallen epically behind on e-mail. It occurred to me that I had previously developed a habit of using my phone to “clear out” obviously unneeded e-mail whenever I had a spare moment standing in line somewhere, so when I finally sat down at my laptop and pulled up my e-mail program, I had a reasonable set of e-mails to respond to. When we were in France, my wife and I bought cheap prepaid SIM cards with just voice and text messaging, so I didn’t have internet access on my cell phone. Hence, when I started up my mail client, I’d see a hundred messages instead of a few dozen, and I’d just shut it down and go on with whatever else I was working on.

Of course, I have internet access on my cell phone now, but I’m generally trying to spend less time fiddling with my phone and more time paying attention to the world around me.

5) In past eras when I’ve found myself behind on e-mail, I’ve avoided starting my e-mail client out of a sense of shame. But I recently discovered that I’m just overall happier on those days when I don’t look at e-mail at all. I’m overall more productive when I dive into the deepest end of whatever I’m working on.

Taming the e-mail beast

In spite of my realization in point (5) above, avoiding e-mail entirely is unsustainable. I clearly need to get my inbox under control, and I need to set up some specific tools to do so. There’s always a tradeoff with such schemes. For instance, if you have to convert some data files from one format to another, you have to weigh the effort required to script such operations vs. the effort required to do the conversions “manually.” If you only have a few files to convert, it may be more time efficient to just do it the brute force way. But if you have a ton of them to convert, the time invested in scripting the operations reclassifies itself as time wisely spent.
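To make the scripting side of that tradeoff concrete, here’s the kind of throwaway script I have in mind. Everything here is a hypothetical stand-in (the .dat and .csv formats, the one-line conversion rule); in practice, the per-format conversion logic is where the real effort goes.

```python
# Sketch of the "script it" branch of the tradeoff: batch-convert every
# .dat file in the current directory to .csv. The formats and the
# conversion rule are hypothetical stand-ins.
import glob
import os

def convert_file(src, dst):
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            # stand-in rule: whitespace-separated fields -> comma-separated
            fout.write(",".join(line.split()) + "\n")

if __name__ == "__main__":
    for src in glob.glob("*.dat"):
        convert_file(src, os.path.splitext(src)[0] + ".csv")
```

For three files, doing this by hand in a text editor wins; for three hundred, the ten minutes spent on the loop pays for itself immediately.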

The most obvious first step is to make a folder specifically for messages from the two heavy-traffic mailing lists I’m on: Synth DIY and Analogue Heaven. It seems that such e-mails already in my inbox needed to be moved manually — doing so brought me down to 4268 messages, 1136 unread.

As a first cut, I created a “Smart Mailbox” called “GaTech Senders,” with a “from contains” rule matching GaTech addresses. Smart Mailboxes don’t actually move e-mails out of your inbox; they’re basically search presets. This should allow me to quickly see e-mails from students, faculty, and staff. Some students and faculty use gmail instead of their official GaTech e-mail, so this would miss those, but I figure if any come up that are important I can add them to the rules somehow. (This might be tricky, since without digging in and mucking about with XML files, Apple Mail doesn’t have a way to mix “any” and “all” operations — i.e., you get logical “and” or logical “or,” but can’t create nested combinations of both without cluttering things up with “helper” mailboxes.) “GaTech Senders” shows 2148 messages, with 307 unread.

What I really want is to distinguish between e-mails that were specifically composed for me (or me and a small group of specific people) and broad announcements. If my School Chair or my Dean or someone in their administration writes a note to me as Aaron, I want that to pop up as higher priority than an e-mail announcing that a professor or student received an award. (This isn’t to say that the broad announcements aren’t important; they often contain changes to or clarifications of policies that I need to pay attention to.) Actually, that’s something that would be useful for all of my e-mail, not just e-mails from ramblin’ wrecks. So, I created a “To Me” mailbox with the rule “any recipient contains lanterma” (this covers my core eight-letter e-mail name as well as aaron.lanterman), as well as a “To Masses” smart mailbox with the rule “any recipient does not contain lanterma.” “To Me” has 3145 messages, 809 unread, and “To Masses” has 1201 messages, 327 unread.

Actually, “To Me” in its raw form misses a few; there’s another e-mail address for me that consists of “al” followed by 3 randomly chosen digits. There’s no way to add that to the “To Me” rules without giving up the inbox-only rule, because you can only set the rule combination operation to “any” or “all.” But I get very little mail at that address; almost everything I receive there is an automated message from our IT department, telling me that my password is expiring or providing a summary of what fell into a spam trap. So I’ll make a separate smart folder for that weird e-mail address, and add an “any recipient does not contain” line to cut it out of the “To Masses” list.
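For contrast, here’s a sketch of the nested predicate that Mail’s flat rule editor can’t express: an “any” over my addresses nested inside an “all” alongside a sender test. The addresses below are hypothetical stand-ins, not my actual aliases.

```python
# A nested any/all predicate that a flat rule editor can't express:
# "from a GaTech sender AND addressed to any of my aliases."
# The addresses here are hypothetical stand-ins, not my real ones.

MY_ALIASES = ("lanterma", "al123")  # "al123" is a made-up placeholder alias

def to_me(msg):
    # inner "any": does any recipient contain any of my aliases?
    return any(alias in r for r in msg["recipients"] for alias in MY_ALIASES)

def from_gatech(msg):
    return "gatech.edu" in msg["sender"]

def gatech_to_me(msg):
    # outer "all": both conditions must hold; trivial in code,
    # awkward without "helper" mailboxes in the GUI
    return from_gatech(msg) and to_me(msg)

msgs = [
    {"sender": "chair@gatech.edu", "recipients": ["lanterma@gatech.edu"]},
    {"sender": "chair@gatech.edu", "recipients": ["faculty-list@gatech.edu"]},
]
print([gatech_to_me(m) for m in msgs])  # [True, False]
```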

Now I can make two new smart mailboxes, “GaTech To Me” and “GaTech To Masses,” that contain appropriate intersections. “GaTech To Me” contains 1174 messages, 67 unread, and “GaTech To Masses” contains 974 messages, 240 unread.

It looks like “GaTech To Me” contains a lot of “false positives.” For instance, event announcements from Georgia Tech Professional Education, the School of Music, and the Center for Teaching and Learning appear with a “To:” field, so they appear in “GaTech To Me” even though they were broadcast to a wide list. So my new strategy still needs work, but it’s a start.

Categories: Uncategorized

Observations about Coursera’s “Fundamentals of EE,” by Don Johnson

October 12, 2014 Leave a comment

Around January of 2014, I finished watching all the lectures for Don Johnson’s Fundamentals of Electrical Engineering on Coursera. While sorting through e-mail from around that time, I came across some observations I sent to a few colleagues, which I refined to share here.

1) As far as I can tell, Don’s course is unique in the set of ECE intro courses. There’s no class at another university that I can directly compare it to. It’s sort of a cross between UIUC’s Analog Signal Processing class (a circuits/signals-and-systems hybrid) and Georgia Tech’s Introduction to Signal Processing class (which focuses on discrete-time signals-and-systems), with information theory thrown into the mix. Don compares analog and digital communication schemes in the context of channel capacity. The course covers both analog and digital processing, with the focus on signals as carriers of information. The scope of the class is breathtaking; the last time I looked at a class and got a similar mind-blowing impression of its depth was MIT’s old Structure and Interpretation of Computer Programs.

It’s utterly brilliant. (But, I have to be careful not to equate “something a professor thinks is really interesting” with “something that will gel with undergraduates.” These are not always overlapping sets.)

Don includes a lot of material on Fourier series and transforms and frequency responses, but he doesn’t include anything on Laplace transforms. The Laplace omission makes sense since there’s not a lot of emphasis on generic switched-voltage-source, capacitor charging/discharging examples that typically eat up a lot of the time in most traditional sophomore circuits courses. Don goes into the frequency domain quite early and generally remains there through the rest of the course.

The uniqueness of the course, to me, provides one of the strongest arguments in favor of MOOCs. If you want a standard sophomore circuits course, nearly every EE department offers one, and frankly, there won’t be much difference between the one at Georgia Tech (ECE2040) and its equivalents at Southern Poly. But before this MOOC, the only way to see this material assembled in this particular fashion with this particular vision would be to move to Houston.

2) The sound quality is utterly horrible, and it’s a revelation to me how much that affects my overall perception of the class. The fan noise from Don’s computer is quite evident, and although I don’t know for sure, it sounds like he’s using the built-in mic on his computer. From lecture to lecture, or sometimes in the middle of the same lecture, the sound quality will suddenly change, as if someone were experimenting with different levels and noise reduction settings in some audio editing software package. I was alternating between Don’s course and some of Udacity’s courses, and the higher production quality of Udacity’s products is striking.

Magnus Egerstedt’s Coursera course on Control of Mobile Robots (basically a graduate linear systems theory course like Georgia Tech’s ECE6550, with neat material on robots added in) was taped in the fancy recording studios at Tech, so the audio for his course is much better. That said, although the audio in Magnus’s course was mostly noise free, it was encoded at a relatively low level. I like to watch lecture videos while riding on the stationary bike, and with Don’s and Magnus’s courses, that was hard to do, even with my laptop volume control maxed out; but with Udacity’s courses, the sound was loud enough (perhaps they used a professional limiter like the Waves L1?) that it overpowered the sound of the bike.

I can’t remember who said this, but someone once noted that the main key to professional looking video is professional sounding audio.

3) The way the in-lecture quiz questions were handled was absolutely maddening. On Coursera, an introduction to the upcoming quiz question is normally embedded in the preceding lecture segment. In Don’s course, the quizzes were clearly thrown in after the fact; sometimes, it almost feels like they interrupt what he’s saying mid-sentence. They’re jarring. It would sometimes take me a second or two to realize I was being quizzed, and that the audio hadn’t simply stopped because of an internet slowdown. I didn’t feel like they kept me engaged, the way well-integrated quizzes in other courses did; I felt like they interrupted the flow, and it was hard to get the vibe back after the sudden interruption. The most bothersome questions were the ones that were thrown in to correct errors made in the main presentation. They’d start with a phrase like “The instructor made an error when writing the node-voltage equations. What should the second equation have been…” Sometimes I wanted to yell at the screen. It sounds like a small, petty detail, but it’s interesting how many small details add up to create a perception, good or bad, of the experience.

4) While Don put together this course brilliantly, I’m not sure Don is the best person to present it in this format. He sometimes tries really hard to sound really excited, and he’s clearly putting in a massive amount of effort, but his voice could send a dozen kittens in Consumer Reports’ laser pointer test center into the deepest slumber. He tends to trail off at the end of sentences, so sometimes the start of a sentence is above the threshold of the fan noise while the end of the sentence starts to dip below it. I realize some of that awkwardness probably stems from the unnaturalness of having to talk to an empty room, which I find to be tremendously difficult.

I should note that my comments above apply to the first Coursera offering of the class; they may have made improvements in newer offerings.

Categories: Uncategorized

The tyranny of semesters & the trouble with Coursera

October 3, 2014 1 comment

Throughout numerous news reports and blog posts, comments on those news reports and blog posts, and e-mail discussions prompted by them, many legitimate criticisms of methods of teaching and learning outside of the usual on-campus class structure have been raised. But those usual on-campus classes have their own limitations; we are just so accustomed to working around those limitations that they’re barely noticed. Why should these same limitations be mapped into the online space? Why do we keep putting horseshoes on our automobiles?

This leads us to the problem with Coursera: it has the word “course” in its name. I’m not saying the courses themselves are bad — the quality on Coursera varies wildly, but many of them are quite good, and I hope to do a “course” on Coursera at some point — it’s just that the very concept of a “course” is artificial.

Courses, along with time units like semesters and trimesters and quarters, are organizational artifacts borne of the practical need to allocate chunks of time associated with chunks of physical space and chunks of biology called “students” and “professors,” and get them to line up in some way those chunks of biology can readily remember, like meeting TuTh at 2-3:30 or MWF 1-2, starting on a certain date and ending on a certain date. Are any of those optimal in any way? Can anyone tell me if there’s any research on whether three days a week for 50 minutes is better or worse for learning than two days a week TuTh? Maybe there’s some material that’s best learned MTuW or WThF. Can anyone tell me? Has anyone even asked?

What about time of day? I have heard rumors of the existence of “morning people,” and there are probably some faculty and students who fit in that category. But for students who are not in that category, I conjecture that a 2:30 PM class is going to be a hell of a lot better for learning than an 8:30 AM class. Has anyone studied that? In all the discussion about problem-based learning and flipped classrooms and clickers or whatever, if someone could show that simply not having class at 8:30 AM resulted in massive improvement in learning outcomes, would we change our scheduling to accommodate that finding? Or would we not even bother to ask the question, given how limited we are on classroom space, and just implicitly state — whether we mean to or not — that we believe the material taught at 8:30 AM is less important than the material taught at 2:30 PM?

This connects to Lanterman’s Temporal Maxim of Education: Any method of online course delivery is superior to an in-person class that meets at eight-assclock in the morning.

Dear university professors reading this post: Did you stop learning after you finished your PhD? If not, how many things have you learned in the past decade that had specific start and end dates?

Categories: Uncategorized

“If everyone could make a living doing what they love to do, then what could be better?”

While watching the closing comments from the “Future of Education Panel” at Maker Faire 2011, I resonated with the answers to the question, “What is your hope for the future of education and technology?”

Mitch Altman: “I would love to see a world where everyone can truly explore what it is that they love to do. And if everyone could make a living doing what they love to do, then what could be better? That would mean we have a world full of many more people living fulfilling lives. And if we had a world where people could be encouraged to explore who they are and do what they love, then our education system would be tops.”

Ben Heckendorn: “Expounding from that point, I grew up in a small town… think if it was 100 years ago, what I do for a living, assuming they had graphics artists 100 years ago, ‘Mary’s Cow Powder’ advertisements in the newspaper… you would never get out of your small town. You’re stuck there. You can do what you can do in that element, but that’s the limit of your scope. But the internet has opened up the world to everyone. Anything in the world is no further away than your computer. That is a great resource, and we’ve only started to scratch the surface of its potential.”

Jeri Ellsworth: “I’d encourage everyone to go out and become a mentor. I try to help out as much as I can.”

In her closing comments, the moderator Michelle Dobson said: “It’s time that we knock down the four walls of the classroom… The world is now the classroom… students will learn when they can be engaged in their learning; have teachers as facilitators of knowledge, not as the givers of knowledge, the holders of knowledge. We need to have authentic learning opportunities for students to succeed.”

I was particularly moved by Altman’s vision: “I would love to see a world where everyone can truly explore what it is that they love to do. And if everyone could make a living doing what they love to do, then what could be better?” That, right there, is what I want my role as a researcher and an educator to be — to help make that world a reality.

Categories: Uncategorized