Name: Maggie McFee
Job Title: Manager of Computing Services/Systems Administration (Physics & Research Computing)
How long have you worked for RC?
That question is harder to answer than it seems. I've been a sort of liaison between Physics and RC for some time and tried to help out where I could, but I only officially began working in RC (50% of my time) this past July.
What led you to a career in HPC?
I've worked around high performance computing in various capacities since the mid-90s, when I was with Intel's Performance Microprocessor Division. I worked in everything from front-line support of engineers and clusters to process creation and management within the support organization. I spent a few years after that in Australia with my then partner and got a taste for working in an academic setting. So the first place I applied upon returning to the US and coming to Boston was Harvard. I've been running the Physics department's computing since 2005, along with Milan Kupcevic, who's also at RC now. As the computing landscape changes at Physics and throughout the FAS, moving into RC seems the obvious choice for the future.
What’s the best part of your job?
I really, really like the people in RC. Everyone is nice, helpful and... honestly, funny. And, on top of that, everyone's top-shelf at what they do. Also, I got to make Odybot. I mean, how many jobs ask you to build a toy robot for them?
What’s the hardest part of your job?
What we warmly refer to as 'rabbit holes'. The RC environment is very complex and has a lot of moving parts, as it were. There's a lot to pick up, from bleeding-edge things that are totally new to you to the processes that have developed over time. That said, having worked in large organizations, I can honestly say it's amazing how much infrastructure this small crew supports and maintains. And everyone's happy to take some time and show you the ropes when you run into a rabbit hole that's new to you.
What’s the biggest misconception about RC or HPC in general?
I think one of HPC's biggest hurdles, perception-wise, is the notion that it's just a bunch of computers networked together. But there's monitoring, storage, virtualization, automation, more storage, esoteric and very, very specialized networking, maintenance, hardware provisioning, vendor relations... and that's before you even get to the software layer. And it all needs to perform 24/7 with a high degree of precision.
Given all the research conducted on RC’s Odyssey cluster, is there any one project that stands out for you?
I have a soft spot for Pan-STARRS (Panoramic Survey Telescope and Rapid Response System) because of my interest in image capture and it being one of the most complex gigapixel cameras on Earth. But the ATLAS (and CMS) project is near and dear as many of my friends and co-workers in Physics are involved with it and the LHC. There are few things more exhilarating than someone telling you, in the strictest confidence of course, "We're going to announce in the morning that we observed the Higgs."
If you could give RC users one piece of advice what would it be?
Check out our documentation and, if you don't find what you need, let us know. We very much want you to have the tools and information you need to get your work done. If it's at hand, it lessens the time you're taken away from what's important: making science happen.
Settlers of Catan, Cards Against Humanity, or Carcassonne?
Oh my god! How can you ask me this?? I'm such a game nerd! OK... I have to admit I haven't actually played Carcassonne, which is tragic. But I've watched it played on TableTop. I'm going to have to go with Cards Against Humanity, because I love the sort of dry, 'cartoon evil' humor that it employs. Gloom and Forbidden Island are also recent favorites. Little-known fact, I'm a Clue collector. I have over 40 different versions including an original Waddington's Cluedo and a special wood and glass collector's set.
Copyright © 2014. All Rights Reserved.