Tag Archives: technology

Don’t tell me not to learn!

I mentioned at the start of the year that I was doing CodeYear. You may be wondering how that is going. Still going strong! After about 5 weeks I cobbled together a little DNA-translator; a few weeks later I finished the entire JavaScript section.
We’re now in html/css lessons, but I already know most of that, so I’m not learning much right now. (I did learn one important new thing, though. I found out why so many websites look the same these days. Twitter Bootstrap! Aha! Oh Twitter, how far your influence stretches…)
So, it’s still fun, and I’m still learning things.
I actually tried to teach myself some coding (Python) a few years ago, but had to admit defeat – something I don’t easily do. I bought books and everything. I thought I’d be okay because I did take some classes at university. But even with the beginner books I was stuck. Why? I didn’t know what to use to actually type the code in, compile, run – all that stuff. I could write code in a text editor…and then…what? I had nothing to work in.
CodeYear is web-based, so you type in the browser, and now I can finally play around with things. When I made the DNA translator, however, I still had to google a bit to find out how to actually put JavaScript code into an HTML file so that I could display it on my own site. (I’m guessing they will teach this at some point, but we hadn’t covered it yet in week 5.)
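For anyone stuck at the same step: the missing piece is the HTML script tag. This is a hypothetical minimal page (not the author’s actual site code – the file name and element ids are made up for illustration):

```html
<!DOCTYPE html>
<html>
<head><title>DNA translator</title></head>
<body>
  <input id="dna" placeholder="Enter a DNA sequence">
  <button onclick="run()">Translate</button>
  <p id="out"></p>
  <script>
    <!-- not HTML comment syntax here: inside a script tag you use JS comments -->
    // JavaScript can live inline in a script tag like this one,
    // or in a separate file loaded via <script src="translator.js">.
    function run() {
      var seq = document.getElementById("dna").value;
      document.getElementById("out").textContent = seq.toUpperCase();
    }
  </script>
</body>
</html>
```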
I mention this to emphasize the difference between Learning to Code for Fun (which I’m doing) and Learning to Code for Serious (which involves knowing exactly what platforms to run your code on before you even start to learn the language, and not typing in browsers).
It’s rather like the difference between learning science from watching a lot of science documentaries and visiting the science museum, versus learning science in actual labs at actual universities. If you want to work in science, you do the latter. If you’re just interested and want to know more, you do the former.
If there were no documentaries and science museums, someone interested in science (but not professionally) would have to get their hands on university textbooks or journal papers and just jump in the deep end. That’s how I felt when I tried to learn to code a few years ago and didn’t even know what program to write the code in.
So as a geek-of-all-trades who likes learning more about everything, I’m happy that there is a site that lets me play around and learn things, just like I can learn more about geology by visiting a museum or watching documentaries or looking at rock formations while on vacation. And if I don’t want to code or don’t want to learn about geology, I don’t have to do those things. Nobody is forcing me. Nobody is forced to learn anything about science after the age of about 15, and yet there are lots of people visiting science museums and watching science programming to learn more about cell biology or physics or geology.
Wouldn’t it be weird if geologists got upset that random people wanted to learn more about geology? If they wanted geologists to be the only people to study rocks? That is apparently how some programmers feel about coding outreach projects. I read this blog post yesterday, and even commented, but it’s still bothering me. Today I realized why:
I should be allowed to learn ANYTHING I WANT.
EVERYONE should be allowed to learn ANYTHING THEY WANT.
I love it when people do science experiments on their own, and I have never met a scientist who was opposed to the concept of amateur scientists. We don’t always take them seriously, but surely anyone can do science if they want to! What is this ridiculous elitist attitude of stating that non-programmers shouldn’t code?
You’re only making me want to LEARN HARDER.

I wrote a program

Earlier this year I shared that I had started doing CodeYear. After a few weeks, I realized I now knew enough to write a very basic program that translates DNA to protein. It took me a few days, and it’s super clunky, but it works!
Try entering “atggaatcatcggccggggag” to find a message. :) (Hehe. Lame, I know.)
When you type in a string of A, C, G, and T, it finds the first ATG to start from, and then translates codons until it hits a stop codon or the end of the sequence.
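The logic described above can be sketched roughly like this – a hypothetical reconstruction, not the author’s actual code, and the codon table is deliberately partial (just enough for a short demo sequence):

```javascript
// Partial codon-to-amino-acid table (a full translator would list all 64).
const CODON_TABLE = {
  ATG: "M", GAA: "E", TCA: "S", TCG: "S", GCC: "A",
  GGG: "G", GAG: "E", TAA: "*", TAG: "*", TGA: "*",
};

function translate(dna) {
  const seq = dna.toUpperCase();
  const start = seq.indexOf("ATG"); // find the first start codon
  if (start === -1) return "";      // no ATG: nothing to translate
  let protein = "";
  for (let i = start; i + 3 <= seq.length; i += 3) {
    const aa = CODON_TABLE[seq.slice(i, i + 3)];
    if (aa === undefined) break;    // codon missing from our partial table
    if (aa === "*") break;          // stop codon ends translation
    protein += aa;
  }
  return protein;
}
```

Try it on the sequence quoted above to see the hidden word spelled out in single-letter amino-acid codes.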
Nothing very groundbreaking (it doesn’t even understand upstream sequences!), but I’m excited that I got it to work! I learned C in undergrad, but never used it after the one course, and only ever did html/css for websites since then. So this is my first “useful” program.

Can technology protect against fraud?

Even though I left the lab a few years ago, I’m still interested in what goes on in labs. Perhaps even more than before, because I can now take a step back to think about things in a broader sense. Take lab notebooks, for example.
There was a news article in Nature this week about going digital in the lab. It inspired me to ask the readers of the Node (who are predominantly working scientists) whether they’d consider a digital lab notebook (you can take the poll there), and I was actually surprised how popular it seems to be. Clearly the past few years have made a big difference, tech-wise, because I think the answers would have looked very different in 2008, when I last wrote in a lab notebook. Perhaps it’s an effect of the rise of tablets, which make technology far more portable and easier to handle in a lab where you walk around all day.

My paper-centric workspace in the lab, in 2005.

I’m happy to see technology and “wet lab” moving closer, but there are still some engrained cultural differences between the two. There are a few interesting comments on the Nature feature that should definitely not be overlooked by eager tech companies ready to push for a more digital culture in the lab. For example, Cynthia Bristow’s comment about accountability is something that’s key to certain fields of research.
If you want an entertaining example of the role of handwritten notes in fraud detection, read the novel Intuition by Allegra Goodman. In more realistic, real-life examples, some supervisors sign off on their lab members’ notebooks, and that is in fact a key reason the notebooks exist – not just to keep track of things, but to account for the work. It shows, years down the line – when you write the paper, or even after that – who made which notes on which date, and when the supervisor saw them.
Is there an equivalent control in digital lab notebooks that can verify with absolute certainty which person made which notes and when they were approved? There might be, but I don’t see it emphasized as a key point of digital lab notebooks. The emphasis is always on ease of finding information, tracking projects, or planning experiments – but the tools are offered as a replacement for something whose key functions include security and accountability.
Are the next cases of lab fraud going to involve hacking into lab notebooks?

Scientists and musicians on Twitter

Not my project, but David Bradley has compiled a Twitter list of scientists who are also musicians (or vice versa). There are now 35 people on the Twitter list, and the first 25 are also on his blog.
I already knew of a few people on that list, or had jotted down their names for my own project, so I’m keeping an eye on this.

(And if you’re feeling really bored, you can also follow me on Twitter. I occasionally get sick of it and delete all my Tweets and stay off it for a few weeks, but I’m using it at the moment. It’s more music than science lately, and a lot of moving woes. I’m moving to the UK in three weeks, to start a new job there, so selling my furniture is my current main project. Anyone want a coffee table?)