Being a Researcher and an Engineer

What exactly is the difference between research and engineering in computer science? I've read a lot of different opinions on this topic, and none of them ever quite clicked for me. Much of the discussion was abstract, without any hard details about what daily life actually means in research versus engineering. There are symptoms of the difference - product groups have timelines and customer bugs, whereas research is protected from such work and happens in a bubble, with a tech transfer back to the product groups - but no clear statement of what it means to be a researcher versus an engineer.

After time in an industrial research lab at Oracle Labs, a Ph.D program, and an internship at Adobe Labs, I have some vague ideas about the differences. I think the biggest philosophical difference is that the researcher must like the journey of discovery more than the sense of accomplishment.

Research is, by definition, something new. Research doesn't have to work, it doesn't have to be practical, it just has to be new. Taken to the logical extreme, imagine a new piece of knowledge that never existed before but has zero practical use. Great researchers not only have to be aware that this is a possible outcome, they have to be satisfied with it. A researcher in the pure sense loves knowledge and its pursuit for the sake of more knowledge, not necessarily to accomplish a goal. Someone who reads Wikipedia and clicks through every article with ADD is a fantastic example, because they love learning just because. Figuring out why things work, why things don't work, when things work, and what would be cool are traits of a researcher.

Engineering, on the other hand, has to make something that works. In the real world, that means using less sexy technologies based on proven techniques with a very high chance of success. The number of unknowns is constrained quickly so that things can be built on time and work in most applications. Hacks that work are prized over elegant theories that might work. The end result matters more than the process of figuring out how to solve the problem, although both are important. I think engineers are motivated more by admiring something that worked than by the steps it took to get there. I've met a number of these people, the workhorses who don't care what you want done, but who will get it done.

I don't mean to say that these two camps cannot coexist in one person; I don't think anyone sits at one extreme or the other. Everyone sits somewhere in the middle of the spectrum. I think the best research and the best engineering actually merge at the top. The best products both contribute something new and work magnificently - from everything I've read, Tesla and SpaceX hit this target. The first iPhone probably required both large amounts of manufacturing research and engineering knowledge. The two sides met at the top to create something extraordinary.

However, this creates a conundrum for someone in the middle. What if you are equal parts researcher and engineer? There has been some discussion of this in other blogs: the research engineer, the applied researcher, the software engineer doing advanced development. These phrases describe the person stuck in the middle, someone who knows how to do both research and engineering but isn't quite comfortable in either camp. They love building things that work, but also have some ADD to build things that might never pan out. They love learning and creating new knowledge, but also want to shoot for a goal with a high chance of success. I think most industrial research labs are moving towards this model, with Google leading the way.

If research is walking towards engineering, is the converse also true - is engineering moving towards research? In computer science, I think distributed systems engineering is moving faster than research is. Many large scale internet companies have the resources and the need to make tens of thousands of machines work together, something academia could never afford. Their end goals are driven by business needs, and engineering is leading the way, probably with input from researchers. In that sense, my naive observation is that engineering is actually driving the field more than research is, and I get the sense that overall it contributes more towards advancing the field.

That's a very bold claim - that engineering drives the field more than research - and the ideal notion would be the other way around: research comes up with innovative new solutions to problems engineers couldn't quite fix yet. I think the problem is exploring a solution space. Research is great at exploring a solution space, which means it comes up with a large number of solutions that don't work. One of the best things I learned at Oracle Labs came from the research director Eric Sedlar. The conversation went something like:

Me: If we have a dead horse, why do we keep beating it and not try something else?
Eric: If you have a dead horse, you should study why it died so that we don't kill the next one.

Enlightening, huh? The problem is that, like depression and happiness, there are a couple of known ways to be happy, whereas everyone is depressed for a different reason. There are no journals or conference papers on approaches that don't work, only papers on techniques that do. Nobody celebrates a dead horse as much as a horse that's winning. It's true that learning why a project failed is important, but few will celebrate it. Hence, a researcher has to like the knowledge more than the goal.

Engineering, on the other hand, constrains itself quickly to techniques that are known to work. Building something that works is prized over using new technologies to try something new. There is a business problem that needs to be solved, and time is spent solving it quickly.

After all this philosophical talk, is there a test to determine whether you're more of a researcher or an engineer? The only one I can think of: would you rather learn or invent something cool that nobody cared about, or build something that works and solves a problem you have? Preferably you can do both, but if you had to pick one extreme, which would it be? I'm not sure it's the best test, but at least it's something to chew on.

Installing Good Defaults

Mentoring is such a difficult facet of life. You want to pay it forward, you want to help someone grow, you want the best for them. I can only imagine this feeling is what having kids must be like, only much less intense. I agonize over the idea that my thoughts and my actions actually influence someone. If that's the case, I must spend as much energy as possible making sure I'm doing something great, so the people who look up to me actually have something worthwhile to learn.

After much thought and discussion, I think I've finally managed to figure out a mentoring philosophy. Not a teaching one, but a mentoring one: install good defaults. We think about this all the time in computer science. Secure by default. Fast by default. Why don't we have the same concept when it comes to people? I think we do; society just calls it something different: values and morals.
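To make the computer science version of the idea concrete, here's a hypothetical sketch in Python (the `connect` helper and its parameters are invented for illustration, not a real API): the safe behavior requires no effort from the caller, and cutting corners requires an explicit, visible override - exactly the property you want to install in a mentee.

```python
def connect(host, verify_tls=True, timeout=10):
    """Return the settings a connection would use.

    "Secure by default": TLS verification and a timeout are on unless the
    caller consciously opts out, just as a person with good defaults must
    consciously decide to do the wrong thing.
    """
    return {"host": host, "verify_tls": verify_tls, "timeout": timeout}

# The lazy, zero-effort call is also the safe one.
settings = connect("example.com")

# Opting out is possible, but it has to be spelled out, which makes you
# very aware of what you're doing.
risky = connect("example.com", verify_tls=False)
```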

When you analyze different cultures, you see that each optimizes for and cultivates different values and morals - how people act by default. For example, the default action in a restaurant in Europe seems to be to take your time, enjoy the table wine, enjoy the food. If you are in a rush, you have to actively override social norms to do so; it's "rude". The momentum behind the idea of enjoying your meals is so strong that most people do it.

In Asian cultures, the default is to value the kids' education almost more than anything else. That's why, across the board, Asian kids generally do better on every test compared to other cultures. Education is valued so much that it becomes the default thing to do. If the running default in your family is that you will go to college, you most likely will. It takes an inordinate amount of effort to stand up to your parents, your culture, and your gut to say no.

Morals are another way society teaches people, because they instill a default. Most people, thankfully, are appalled at the idea of killing someone. Whenever something is morally appalling, e.g. selling drugs (I don't find this morally bad, I'm just saying society does), it takes a conscious effort to do it. Not only that, your initial reaction is to not sell drugs, which makes you very aware of what you're doing.

As a mentor, that's really what you're shooting for. You want to install great defaults in your mentees. The default reaction when something sucks is to recognize that it sucks and make it better. The default action is to always be kind to other people. The default is to write clean code. It should take an inordinate amount of energy for someone to settle and say their work is good enough.

Other people allude to this property; it just never clicked for me as a mentor until now. There's an old Hindu saying: "For the first 30 years of your life, you make your habits. For the last 30 years of your life, your habits make you." It's the same idea - install great habits and great things will happen as an offshoot.

The Graduate School Admissions Process

Having gone through the 2008 recession and seen the money flourish again in 2010, we've had an interesting three years of interviewing potential graduate students. My hope is to guide prospective students through the admissions and interview process I've experienced at UCI.

The biggest difference between interviewing for a for-profit company and a non-profit educational institution is the fixed length of the term. Companies hire employees to produce for a few years, with at-will employment: the person can leave at any time, and the company can fire the employee at any time. Getting a PhD involves attending school and graduating in five to six years. This really changes the interview game, because you aren't necessarily looking for someone who is productive from day one. Nor are you looking for a specific set of technical skills, because everything can change within six years. It forces you to look for fundamental character traits and to focus on potential more than current skill, although current knowledge is always a bonus. Interviewing graduate students is also a relatively rare and recent opportunity for our lab. I don't remember interviewing any graduate students in 2008, and only a few in 2009. The UC budget cuts really limited the number of graduate students who could enroll; there wasn't enough money to commit to a new student, because if a new student enrolled, current students couldn't finish their degrees.

The actual process, from a prospective student's perspective, is much like interviewing with a company. First, we take a look at your graduate application. If you look interesting on paper, we contact you and set up a Skype session. We talk with you in a very casual manner to see what you're interested in, learn some of your background, and ask a few technical questions. We also ask for a piece of code that you wrote and are particularly proud of. Systems software requires a lot of hacking skill, so we want to see what you can do. If the Skype session goes well and you live in the United States, we fly you out for a few days, wine and dine you grad student style, and see what topics come up. At this point, everything is pretty standard. The hard part is defining "good enough" to grant admission.

Technical Skills

A really difficult question is how much technical skill an incoming graduate student should have. Do we test based on their educational background? Is the bar higher if someone has a masters? What if they don't have an undergraduate degree in computer science, but have taught themselves everything? As an educational institution, rather than a corporate entity, the expectation is that the person can learn fast enough to get a PhD in five years. We don't need specific technologies that are useful today. We don't even need compiler experience, because you can learn the basics in six months, which is a short time compared to the time it takes to receive a PhD. By definition, a university can and should spend more time training people.

Of course, there is a lower bound on what kind of student can really accomplish a PhD. Just as you would never let a baby drive a car, someone who has never hacked a single line of code in their life could never finish a PhD. You don't want to admit a student so underprepared that they would never finish; it's unfair to both the student and the university.

On the flip side, there isn't an upper bound. The exceptional applicant has already written their own virtual machine, their own garbage collector, and developed their own programming language. If you have done this, by all means please join us. The bar for a normal applicant is somewhere in the middle, and setting that bar is really difficult. In our evaluation, a new applicant has to pass the lower bar, which is much lower than a corporation's. A corporation like Google wants to ensure you can be productive and make them money within six months. The graduate school timeline is two years. We know that in the first two years of your graduate career, you will not be productive. You'll be learning, taking courses, and, if you're lucky, touching some research. If you're really lucky, you'll have a paper. The technical question is: can you learn what you need to in two years? The lowest bar requires the following technical skills:

  • Basic data structures and algorithms
  • Experience developing, maintaining, and testing programs. It doesn't have to be system software.
  • Some notion of software design / architecture.
  • The ability to articulate technical issues - e.g., describe a memorable or cool bug you've had.
  • Likes to hack - probably the most important.

Bonus, but not required:

  • Compiler or system software experience.
  • Knowledge of garbage collection
  • C / C++ / Assembler 

It's a pretty short list and a low bar. Some may ask how such people could possibly accomplish a PhD. Fundamentally, there is a philosophical difference in how we want to admit graduate students. People are here to learn a new skill, perhaps to change their background, and a university should support these endeavors rather than try to make money or papers off these students. Graduate students are training and learning a new skill set; they aren't here to produce a product. Papers and software are by-products of the graduate school experience. The main product is a graduate who has the ability to learn, the opportunity to learn, and the advice of their fellow students.

Culture Fit

Culture fit is one of the most important things we look for. Tony Hsieh hit the nail on the head when he built Zappos. You really need to like going to work, like your coworkers, and be able to hang out with them outside of a professional work environment. Likeability is especially important in graduate school because working in a lab sits somewhere between real friends and a professional corporation on the social connection scale. For example, do you need to invite everyone in your lab to a colleague's personal birthday party? It's ambiguous. In a real friend group, the answer is yes. In a corporate setting, the answer is no. In grad school, maybe.

Your colleagues live a five minute walk away from you. You see them every day, live with them, and have to generally enjoy their company. If you can't get along, the next five years are ruined for everyone. We really look for people we can be real friends with, not just professional friends. This means having a beer, asking for advice about personal issues as well as professional ones, and going to Vegas together. So what's the culture fit requirement? Be cool. Sorry to be so vague. (The "be cool" is my personal culture fit.)


Funding

When a graduate student is admitted, they usually receive a multi-year funding package. Funding is broken down by academic year, with each year paid for by a different department or entity, and thus a different pocket of money within the university. For example, UCI has three year packages for the average admitted graduate student. If you are highly ranked, you may get four years; lower ranked, two years. A standard three year package is broken down as:

  • One year paid for by the advisor.
  • One year paid by the CS department.
  • One year being a Teaching Assistant / Grader.

The remaining years, four through six, are paid for by the advisor - or you've already dropped out of graduate school. The important kicker is that after your second or third year, you should have advanced to candidacy. At that point, a student's tuition is reduced to the same price as a domestic in-state resident's, a key milestone because international students cost 2-3x more than in-state residents. The university reduces tuition because it has already committed enough resources that it has a vested interest in ensuring you graduate.

GRE scores questionably reflect an applicant's intelligence, but for the purposes of funding, they are important. Even if your Skype interview is great, we probably won't admit you unless your GRE scores are high enough to secure a department funding package. It simply costs too much to admit you.

On the flip side, if you are a stellar student, we might not admit you. If the department reserves a package for a student, but the student has a very high chance of going somewhere else such as Stanford, the money goes away if the student doesn't actually come to UCI. Most graduate students don't understand that the advisor matters more than the institution, so many students who are admitted to UCB with a smaller package will still go to the more prestigious university. UCI attempts to lure students with four year packages, but it doesn't always work. Instead, admissions tries to target students who would probably accept UC Irvine's offer, taking the middle chunk rather than the superb student.

For example, this year we had a stellar applicant who already has a masters degree. He has been working at a major industrial research lab for a few years and is going back to school for his PhD. He already knew everything about virtual machines, had stellar code, and could probably be the lab's postdoc rather than an incoming graduate student. He got an offer simply because he was fantastic, but we are reluctant to believe he will actually come to UC Irvine. We'll see what happens.

Actual Admissions

Admission decisions are made by a committee comprised of university faculty. A specific advisor may want to admit you, but the admissions committee must agree unless there are extraordinary circumstances. If you are a stellar applicant and impress the faculty member, you have a very high chance of being admitted. However, at the end of the day, the committee decides admissions, and nothing is final until it approves.

What I Look For

Disclaimer: This is what I, Mason, look for and is not indicative of general lab policy. There isn't even an official lab policy beyond what's happened in the past. Everyone has their own thing they are looking for, their own style, and their own sense of "fit".

Coach Wooden - "Nobody cares how much you know (until they know how much you care)"

Personally, I try to follow Coach Wooden. I give a lot of leeway for not knowing anything about compilers, although it does make me question how and why you want to join a systems software lab. I care much more about whether a person likes to hack. How much they care about their code quality, and how much they care about the presentation of themselves and their code, matters much more to me than any particular domain knowledge. One thing I like to do is ask prospective students to send us some of their code. Just by looking at it, you can see how much they care about programming. You can glean how much experience they have and whether they are trying to improve the quality of their code.

Outside of technical ability, I'm also trying to see whether this person is constantly improving themselves. If they are committed to improving their skills and genuinely interested in the field, they get more bonus points than someone who is just applying. They may even be better off than someone who has some experience in systems software. If they have shown the work ethic required to actually generate results, awesome.

Of course, accurately gauging an applicant's ability to program, let alone projecting whether they can accomplish a PhD, is incredibly difficult. You never really know; each person is a bet.

Personally, I like voting for the underdog, because I'm a terrible interviewee and I've been given many lucky breaks - I didn't know anything about VMs when I entered. I don't think I would make it through the interview process we have today, because enough students now have some compiler experience. I forget where I read it, but the CEO of some startup said their engineering culture and interview process became so rigorous that the original engineers didn't think they'd be able to get in. It hurts my ego, but I feel like we're at that point today. The students are getting better and better, and that's for the best.

Good luck.

Side note for international students: the informatics and software engineering departments at UCI study software methods (e.g., the waterfall method, agile development). At least at UCI, it is not a major you should apply to if you want to do lots of programming.

P.S: Any feedback on how we should admit prospective graduate students is appreciated.

Crafting a Vision

Vision is the ability to articulate a future. The secret to vision is twofold: asking why, and asking why not.

Asking why starts the process of understanding why something is the way it is. It's focused on the present and asks historical questions about how we got from point A to point B. Why is the web this HTML, CSS, and JavaScript mess? Why are fossil fuels so important to our economy? To understand any context, you need to read history and find out what happened and why. Otherwise you'll be bound to make the same mistakes over and over.

Once you know why something is the way it is, you can create a plan for the future. Why not is a question that implies the future. Why not have a computer on every desk? Why not have all my data across all my devices? Why not have only digital books? Why not takes you from the present into the future. Vision emerges from why not.

Whatever ideal future pops into your head after asking why not needs to be articulated. That articulation is your vision for the future. And like all marketing, a vision has a few components.

Two Kinds of Vision

One kind of vision is jumping on general trends and working on incremental improvements. For example, it's pretty easy to say that hybrid cars are the way to go, and that the United States will require more fuel efficient cars. If I were Toyota or Ford, I'd be working on more fuel efficient cars, since people want cars that can go further on a gallon of gas. As Kevin Rose said on TWiT, this is riding the current wave. It's also a lot easier to do well with. If the stock market is going up, you can pretty much pick anything and make money.

Most people have a vision of the first kind. They look at current trends and capitalize on them. Tons of companies were doing search prior to Google; they were even advertising with search. But Google had a better way of searching just as search was becoming the big app of the internet. Dell rose so fast because they sold commodity computers directly through the web as everyone started needing a computer. You can be hugely successful riding a big wave.

The second kind of vision is creating a wave where nobody else is. Steve Jobs has done this multiple times. Everyone in the smartphone market was making their own crappy phones until Steve Jobs came out and said: here is a radically new kind of phone. Touch started the smartphone innovation wave we see today. Microsoft created the personal computer business with a vision of a computer in every home. Amazon became the go-to online retailer by convincing us that shopping online for everything is viable. When you create something that changes your expectations and how you live, it creates a new wave. These are game changing visions, and only a few companies ever hit these high notes.

There is a rare version of the second kind of vision that cements you in history. This is when everything changes. The internet is a whole new kind of wave, a different kind of communication medium. Thomas Edison is famous for inventing an electricity distribution system. The Wright brothers' harnessing of flight defined the generations after them. These things come only once every few decades, but they are truly revolutionary.

Vision is Social

The critical thing about vision is to understand how your change impacts society. Nothing, no matter how technically competent or awesome, will ever be realized without understanding the social impact. Every vision must scratch a social itch; every technical solution is just a means to scratch it. Mobile devices feed our need to check anything, anytime, anywhere. We are addicted to the internet. An iPad lets me be lazy and surf the web in my bed. Cars solve the frustration of waiting too long to travel somewhere by horse. Electricity rids us of the annoyance of lighting a candle. If you aren't solving some kind of human problem, you don't have a vision. You only have a solution to a technical problem.

Vision is Scalable

Your view of a better world needs a scale. It doesn't matter what that world is or how big the scale, but you do need to define it.

Vision can be a very small and personal endeavour. I want to lose 10 pounds this year. The whole New Year's resolution tradition is giving yourself a one year vision. When you're asked where you see yourself in five years, your answer is really a vision at the personal level.

Vision can scale up and start affecting other people. You want to become a great mentor and help the young achieve their potential. This kind of gift is of tremendous value, and lots of people love being teachers for exactly that reason: they love seeing their past students grow into respectable adults. This vision is at an interpersonal scale.

Thinking bigger, you can start to influence your industry. The Google Android team probably has a vision where everything can be done via speech to text technology. Google itself wants everyone to search for things on the internet in whatever way they feel most comfortable. They have a world that redefines how we find things online.

Scaling up further gets into regional issues. Lowering drug use in New York City is a community issue. Removing air pollution from Beijing prior to the Olympics was a regional issue. Larger still, perhaps one day the Middle East will no longer be a hotbed of violence. These are concrete regional visions.

You can think at the global level, or even higher. At the global level, every world leader, at least verbally, would love to stop using fossil fuels as the main energy source. Eliminating a whole class of energy use is a vision at an amazingly high scale.

NASA gets to have the grandest kind of vision, since they are the only ones doing the things they do. Who else gets to study the universe? Who else gets to think of things like terraforming another planet? NASA could have a satellite in every neighboring galaxy by 2050, or build a machine that sits on an asteroid and sends back data for the next 50 years.

Vision scales; you can have multiple visions at each scale. You just need to know where on the scale your vision sits.

Vision Grows

Vision isn't stagnant. It changes with time and with market conditions. It has to be flexible enough to weather all the fashions that may come and still leave enough room for organic growth. Zappos' vision is to have the best damn customer service in the world. When they started, though, it was probably something like "let's make buying shoes a great experience", and it grew into having the best damn customer service in the world. Amazon probably didn't start with the vision of becoming the de facto online shopping experience, digital media included. Back in the 1990s, Jeff Bezos probably just wanted to have all the books in the world. Like a person, your vision grows with time.

Vision is Abstract

A vision has to fit into one sentence. I don't think any powerful vision that really spreads can be longer, because a vision requires that other people buy into it and act on it. You need people not only to want your change but to believe it can happen. You have to entice them, and leave your vision loose enough that those who hear it can visualize it. "Let's put a man on the moon." Simple, powerful, easily visualizable, and easy to act on.

If your vision is too concrete, people won't come up with ingenious ways to act on it. Zappos' vision of having the best damn customer service in the world lets their employees do awesome things like free shipping both ways and a no questions asked return policy. Bill Gates's philanthropy is not only amazing but simple: improve health in the third world. You can imagine all kinds of ways to do it, it's a powerful message, and you can easily visualize an African kid no longer being skin and bones.

Vision isn't Execution

Vision gives people a destination, but that doesn't mean you'll get there. You still have to execute. Vision and execution are two separate ideas. If you have a vision but can't execute, it doesn't matter. Someone else may come along and execute your vision, or your idea of the future may be so inevitable that it happens anyway - but at the end of the day, if you can't capitalize on it, what's the point?

The actions with the most impact have vision and execution going hand in hand. As Peter Drucker said:

"The best way to predict the future is to create it."


Thanks to Christoph Kerschbaumer and Michael Bebenita for proofreading and feedback.

The Real Value of an Internship

Most people assume that the most valuable thing you get out of an internship is a full time job once you graduate. While I'd be silly to deny that a full time job is extremely valuable, especially with 10% unemployment, there is a second, more valuable aspect: the ability to explore.

Once you graduate and start working full time, you start getting paid. And money = a chain on your leg. You're limited to working only on what's important to the company, not your own personal development. Worse, it's socially unacceptable to leave a company if you've been there less than a year. Job hopping is detrimental to your career, greatly limiting how fast you can explore new, interesting things to work on.

Internships buy you the ability to try out different companies, different types of jobs, without the social penalty of leaving.

Discovery is so much more important than any paycheck, any immediate technical skill you learn, or anyone you meet. It lets you find out what you really like working on. Of course, it's much easier to say this post undergrad, after being an intern for eight years. No doubt a stable paycheck is a very tempting carrot. It's very difficult to say no to a *perceived* high paying job while in school. Anything more than minimum wage seems like a gold mine. No more ramen! Plus, you get your first real taste of freedom, the ability to tell your parents they don't have to take care of you anymore. An amazing feeling, but at what cost?

Two years after undergraduate, I can safely say that taking the paycheck sacrifices the long term for short term gain. I know many friends who are giving up on their passions. You can see their dreams dwindle, suffering a slow, painful death in the gutter. I look at the end result: my mother, who just recently retired, my cousin in her mid thirties, and countless others. When I compare them to people who still love their jobs, the only difference I see is that the latter spent the time to discover what they really liked to work on.

Which is the real problem internships solve. They let you discover the breadth of work available without the social stigma of being "uncommitted". The technical skills you learn will help you in the future. The people you meet will leave a mark, a lesson that you can carry with you. But at the end of the day, the feeling of working on something you love or hate is what's truly valuable. Because within three months, if you hated it, you can leave. If you love it, you've found a new path - and that's worth more than any full time job offer.


Finding an Internship

This section really didn't fit into the essay, but I know it's difficult to find a regular job, let alone an internship. Tips:

  1. Use craigslist and any other job boards, and look for internships year round. Many are co-ops, which is really a fancy term for part time.
  2. If you're close to graduating (within a year), apply for a full time job and ask if you can start now, but part time. Worked for me. Just say it helps you reduce ramp up time :).
  3. The hiring window for most tech companies is between October - January. It's a lot harder to find a summer internship any other time.
  4. These rules seem to go out the window in graduate school.

Defining a Good Academic Paper

A "good" academic paper is defined as "presenting a new idea". Half of a good paper is about content: how great is your idea? Judging that alone is difficult, highly subjective, and mostly a gut feeling.

The other half is how well the idea is presented. Every presenter, book, and teacher can present new ideas in a way you can understand, given enough time. But nobody has time, so a paper has to be incredibly easy to understand, which makes writing a good paper exponentially harder. A paper also has to explain not only the idea, but its context. A groundbreaking climate model is useless unless I know why I need one.

So a "good paper" needs to "present a new idea in an easy to understand way".

Present doesn't say much, nor does it imply any kind of usefulness for the reader. Press releases present something new in an easy to understand way, but most are useless. The whole point of a paper is to expose your idea to the rest of the community, so that they may use it to solve their problems. Perhaps we should be trying to "teach" someone. But that isn't the case either, as teaching implies a hierarchy - a teacher/student relationship. In reality, you and your readers are colleagues with equal weight, and you are hoping that they will like your idea. Which maybe means you are trying to sell a new idea. But "selling" has a horrific connotation of stuffing something down your colleagues' throats. The only word I could think of that implies respect, while still explaining an idea, is inform. A paper isn't trying to teach or sell, but trying to inform a colleague of a new idea. We can change our definition of "good" to:

"Inform a colleague of a new idea in an easy to understand way"

This is a fairly abstract definition, but a good starting point. The definition of "new" is murky and depends on the field. How much has to be new? Does the whole system have to be new? Does every paragraph have to present a new piece of information? I'm not sure, and I don't know how to define "new". My guess is, it'll be like what Justice Potter Stewart of the U.S. Supreme Court said about obscenity: "I know it when I see it". Just make sure that in your paper, the reader can see the "new" part. Put big circles, stars, lots of noise, and big bold letters around the "new" part of your research.

The hardest part of writing a good paper is making it easy to understand. The most difficult part of making something easy to understand is culture. Even determining the responsibility of understanding something is cultural. In Outliers, Malcolm Gladwell explains the notion of "transmitter" versus "receiver" oriented communication. Most western or "transmitter oriented" cultures say that the writer has to ensure that the reader understands the message. In "receiver oriented" cultures such as eastern cultures, the person listening has to decode the message. Since most scientific papers are published in English, a "good" paper means it is the responsibility of the author to ensure that the reader gets it. The paper has to be proofread by many different people, rewritten, and edited many, many, many times. Most papers are difficult to understand because they are just written, not edited.

Culture also brings up the issue of writing style. I'm unfamiliar with how all Europeans write English, but American English is very different from English written in Germany/Austria. American English, at least according to friends who majored in English, is good when it is concise. Short and simple. Germanic English, on the other hand, is concerned with being "precise", or adding lots of details. And by precise I mean that every thought related to the sentence is explained, in the sentence, with every nuance covered trying to eliminate all possible avenues where a point can be attacked, which of course creates sentences that are strung together to make a very long sentence without the use of periods, making each sentence difficult for an American to read. (E.g., "the JIT does xyz" versus "a JIT compiler can do xyz"; "we do escape analysis" versus "we do escape analysis for single threaded programs".) American readers will hate your writing if you make it overly precise (me!) and Germans will hate your writing if you make it short and concise, as your sentences are no longer "precise". I'm sure there is an "Asian" version of English, but I have no idea what that is yet.

The other non-culture hurdle of making something easy to understand is deciding on what is background information. How much can you assume someone knows? Most people guess too much. Dumping all the background information in the world makes a great textbook for an undergraduate course. Assuming everything will ensure that nobody knows what you're talking about. Try to imagine someone who has taken one or two courses in the subject and nothing more. If in doubt, give an extra sentence for background explanation. Don't feel bad if you scratch your head over what's background information because it's a really hard problem. (A website that contains a list of terms and definitions that everyone in the field should know has wiki potential written all over it.)

The easiest way to overcome these hurdles is to use examples, because they solve the problems of abstract ideas, background information, and writing style all at once. As much as scientists say they love the abstract, people can't think that well in the abstract. Self contained examples solve a lot of the background information problem. And many writing style issues go away if the reader already has experience with the example.

The best examples start out with a simple case and slowly add new ideas, creating a holistic picture. It's hard to create one example that covers everything. But if you can't do it, your paper isn't focused enough or you haven't thought hard enough. Keep thinking.

A good academic paper, then, needs to "inform a colleague of a new idea through concrete examples". I've only written two papers and read too many, and I'm sure my definition will change with more experience, but that seems to be better than most after two years in grad school.


Thanks to Christian Wimmer, Michael Bebenita, and Ali Haeri for proofreading.

Inside Grad School

This economy is tough, with unemployment almost hitting 10%. I've seen it firsthand as some of my friends are without a job. Going to graduate school seems like a good alternative, and I've been fielding a lot of questions about my experience. Maybe this can help anyone else who's thinking about going to graduate school.

The most surprising aspect of these questions is how little undergraduates really understand about what goes on in graduate school. Yes, they know that Teaching Assistants are grad students, but that's about it. I admit I was guilty of this. For some reason we always had this illusion that teachers and professors somehow weren't people. They came during their appointed hour, we students listened, end of story. It was weird to see them shopping in a grocery store. Naive, yes, but a real thought.

Upon becoming part of this "grad school", it was stunning to see how many preconceived notions I had about it. While reading tons prior to actually enrolling helps, there were still a few surprising things that nobody seemed to pinpoint. I hope that by illuminating some of these gray areas, you may have a better understanding of what grad school is all about.

The first question people ask themselves is: a Masters or a PhD? I grappled with this question for two years and didn't really have a good answer until a coworker at a startup I was working at said:

"Imagine yourself at a table for the next big project. The PhDs talk. The Masters take notes".

I'm not sure how true this actually is, but it seems to be the case. I'm sure there are exceptions, but this is probably the norm. You'd think that since PhDs are supposedly crazy smart, they wouldn't have this social structure; it would be a pure meritocracy. But alas, social structure is outside of logic and is another essay unto itself. The difference, on a more formal level, is that masters "master" the current state of the art and apply it to current problems. Masters programs are like a continuation of undergraduate: more classes. Very rarely will you do research. PhDs, on the other hand, research new problems, creating the next state of the art. One isn't better than the other; you just have to know which one you are more comfortable with. Do you prefer dealing with very practical problems that have known solutions? Do you like problems with "right" answers?

And by "right" answers I mean correct. Up until undergraduate, everything you do has a clear right or wrong answer. Every test painfully lets you know when you're wrong. Are you terrified of being wrong? Go for a masters. Or do you love big gaping voids and putting a box around them? Are you comfortable knowing that you could be asking a question with no answer? Are you okay staying in limbo for a while? That's the plus and minus of a PhD. You have to make the choice for yourself. The worst case scenario is you go for a PhD and drop out with a token masters. If you go for the PhD, you get the masters along the way, which is just another brownie point on your resume. The rest of this essay assumes you are chasing the full PhD.

Which leads to the most important question you need to ask yourself: why are you doing it? Do you really want to work on something for at least four years? Would you work on this, or at least think about the subject, for free? Now, you may not even know what you want to study! That's fine. A lot of people think that you have to know exactly what you want to research from the get go. You don't. Nobody is asking you to step in on day one ready to do groundbreaking, Nobel prize winning stuff. You only need to pick a subfield and see where it takes you.

How deep is a subfield? Your interests will change over the next five years, so choose something broad enough that you can wiggle around in it. For fields such as history, you may only need to know that you want to study American History or the Roman Empire. It's really broad, but that's the point. You want to stick your foot in a pool large enough to swim around in and find whatever eventually becomes your research topic. When I enrolled in graduate school, my interests were in programming languages, compilers, and virtual machines. Over the first year, they narrowed to virtual machines. After the second, virtual machines for dynamically typed programming languages. At the onset, you only need a broad topic that can narrow with time.

Of course this leads to the inevitable fear that you have to create something groundbreaking and new! You must write something that will revolutionize the field. That's what "novel" and "publishable quality" mean. Right? Nope. I really overestimated how much new work is required for something to become publishable. Imagine a car. You may think you have to find a new body frame that has less drag and is twice as cheap to make. Or that you have to create the new economic model for the 2008 collapse. Not true. If you find one, more power to you; please remember me and give me a job as you stand and accept your Nobel prize. In reality, a paper asks a very small, narrow question and boxes it in a pretty little package. Instead of asking how to design a whole new body frame for a car, ask: if you change the screw material from iron to diamond, how well does the tire bolt onto the body frame? A much smaller question.

And that sums up what you do in graduate school. You start out with this big field you like and slowly become more specific with time. You do some research, then write a paper on it, and at the end write a thesis. It seems like the masters thesis is really just your first paper, double spaced with another background information section. The dissertation is the 4-5 papers you wrote over 5 years, double spaced with transitional paragraphs in between. Perhaps this is just something unique to UC. I'd be interested in seeing how other universities handle this.

If your fear is that you aren't smart enough or creative enough - don't worry. Everyone goes in just as dumbfounded. You may wonder, why are grad students so smart? Ask yourself: do you think you could TA the beginner courses in your major? That's how. And a lot of success isn't based on intelligence; it's based on persistence. You only need to be smart enough, not the smartest. The only question is, within 5-6 years, can you figure out something that you know you want to do? I don't know if there is a test that can answer this question, but I have a hunch. Do you naturally experiment in your own world? Do you sit around and think about your field? If it comes naturally to you, I suspect you will be fine in graduate school. All sold? Then the next question is:

How do you choose an advisor?

This choice will play the biggest role in your graduate school experience. I wish someone had told me this earlier, because you live and die by your advisor. Pick a bad one, and no matter what, you are going to hate your life. Your advisor pays you, determines when you graduate, determines what you can do, determines what you can study. Make sure you pick a good one. I've somehow been blessed with a terrific advisor. I don't know how I got so lucky, so I can't chime in here. Finding an advisor is kind of like a job interview - you never know what it's really like from the onset. If that's the case, what can you do? I don't know.

I chose my advisor by chance. I'm going to grad school at the same university where I did my undergrad. I've been told this is bad practice. Too bad Irvine is too nice a place for me to move. I took my advisor's compiler class, got my first gray hair ever, and asked to do research. He said he didn't have the cash to pay me, so I worked for free for 6 months. This gave me plenty of time to see if the group was for me, how everyone worked, and if I could see myself doing this for the next 5 years. It was like an unpaid internship. While I know it is painful to live on savings for 6 months, eating ramen and sleeping on friends' couches, it let me do two really important things:

1) Test drive the research group and advisor - an investment in the next five years.
2) Gain admission into graduate school on sub par credentials.

The dirty secret is that if a professor wants you as a student, it will happen (probably pending funding... read up on the UC funding situation to see why this isn't happening). The arrangement was mutual: my advisor pulled me through the admissions committee and got free labor in return. If you have no idea where to go, what to do, or how to contact a professor, try this: offer yourself as free labor for 3 months in the summer and see what happens. I'd pick summer over any other semester/quarter, as summer is when you can actually get lots of research done. It's the purest research time in that sense, as no classes are going on. I don't actually know how many professors would take the deal of free labor. But it's an offer they rarely get, it shows that you really do have enough passion, which is a massive plus, and it's crazy enough that it may work. Try it and let me know.

Even if you stun your coworkers and your advisor loves you, you still have to do a few things to be admitted into grad school. Like take the dreaded GRE - the test that doesn't test anything other than how much you can cram. If you are shooting for 1400+/1600, you need at least three months of 1-2 hours of studying every day. Tip for the math: always test the edge cases. The GRE math is full of tricks, not difficult math. E.g., for the quantitative comparison questions - pick A if column A is bigger, B if column B is bigger, C if they are equal, D if there is not enough information - plug in the edge cases: 0, 0.1, 1, 2, 100000, -0.1, -1, -2, -1000000. Then you can see the tricks. The verbal is just memorizing words. Memorizing about 1500 words instead of all 4000 from those Barron's books is good enough. Then learn the Latin roots, prefixes, and suffixes. For most questions, you probably already know 1 or 2 of the five multiple choice words. Once you learn the 1500 Barron's words, you'll know another 1-2. This way you can usually eliminate at least 3 of the 5 choices, giving you a 50/50 chance of guessing. Optimizing away the GRE. When should you take it? I remember a freshman class where professor Stephen Jenks said something like, "Take it your senior year of college. The scores are good for five years and you're still in test taking mode."
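The edge-case trick above is mechanical enough to sketch in code. This is just an illustration of the strategy, not a real GRE question: plug each boundary value into both columns, and if the relationship ever flips, the answer is D.

```python
# A sketch of the GRE quantitative-comparison edge-case trick:
# try the boundary values and see whether the A/B/C relationship
# holds for all of them. The example columns below are made up.

EDGE_CASES = [0, 0.1, 1, 2, 1_000_000, -0.1, -1, -2, -1_000_000]

def compare(col_a, col_b, cases=EDGE_CASES):
    """Return 'A', 'B', 'C', or 'D' (not enough information)."""
    results = set()
    for x in cases:
        a, b = col_a(x), col_b(x)
        if a > b:
            results.add('A')
        elif a < b:
            results.add('B')
        else:
            results.add('C')
    # A consistent answer across all edge cases is the answer;
    # if it changes with x, the relationship isn't determined.
    return results.pop() if len(results) == 1 else 'D'

# x^2 vs x: bigger at x=2, smaller at x=0.1, equal at x=1 -> 'D'
print(compare(lambda x: x * x, lambda x: x))
# x^2 + 1 vs x^2: bigger for every x -> 'A'
print(compare(lambda x: x * x + 1, lambda x: x * x))
```

The first comparison is exactly the kind of trick the test loves: x squared "looks" bigger until you try the fractional and zero cases.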

Why should you even study for the GRE? Didn't I just say that if a professor wants you, you're on your merry way? Yes, but your GRE score gives your advisor leverage during the admissions and funding process. At least at UC Irvine, admissions are done by committee. If your advisor likes you, admissions are probably the easy part. Funding is the hard part.


This is why professors work incredibly hard to get funding and grants: they pay you. The toil you hear everyone complain about - constantly going around asking for money - is oh so true. Horrible, I know. I wish we could revamp the system, but alas, it is what it is. Studying for the GRE and keeping your grades up is all about helping your advisor convince the other professors to give you an admissions "package". A package says "you are guaranteed funding for x number of years". The higher x is, the better off you are, especially in an economic downturn. If your advisor runs out of money, you either work for free or drop out of grad school. Funding can be a massive headache. If you put in 3 months to cram for the GRE in exchange for 4 years of relative sanity, it's probably worth it. The package years are paid through multiple channels - your advisor, your department, the school itself, and teaching assistantships. The remaining years are paid by your advisor through grants, or by you becoming a TA. Being a TA means your tuition is paid and you get a small stipend. Very small. At least you aren't paying for school.

There are other ways to offset the financial situation, at least in the sciences. I'm somehow in this amazing situation where my research and industry interests magically align. I was admitted in 2007 and started doing trace compilation for JavaScript because I didn't want to work on Java. Lo and behold, Browser War 2.0, based on JavaScript speed, became the rage. I have no clue how this happened; I'm just grateful it did. This also means it's possible to get an internship doing research, although I hear this is super rare, especially outside of the sciences. But it is possible to have a taste of industry while in academia, and it helps supplement your puny TA stipend. Again, this truly depends on your advisor. If your advisor doesn't want you to intern, then you aren't going to. And if you are an international student, say hello to the INS and their visa issues. My only advice would be to do internships that align somewhat with your research interests. Otherwise you'll be in graduate school even longer.

Which leads to a part of graduate school nobody ever told me about - the social aspects. Everyone always focuses on the research aspects, the technicalities. But there is only so much research you can do. You need social interaction to stay sane, and graduate school is weird. You are still piss poor, like a continuation of undergraduate. You eat $5 footlong Subway sandwiches. It's hard to spend $20 on a meal. You drive a beat up Corolla while your friend drives a BMW, with his girlfriend/soon to be wife driving an Infiniti. You're in this awkward bubble and you're no longer "normal". There are some nice perks. Everyone you talk to is smart and can have a conversation about really interesting topics such as politics. Jon Stewart is everyone's hero. You're on your own schedule, with the ability to wake up at noon, sleep at 2 AM, and work without a dress code anywhere, anytime. Almost all your peers are international students, giving you a really diverse and interesting chance to explore the world from a different point of view. But you lose some sense of what it's like outside of this bubble. It's a bit harder to relate to people outside of graduate school. They are working on their careers, climbing the corporate ladder, and planning their families. You are planning... your dissertation, something few can relate to. It takes a bit more social skill to relate to them, and it's difficult to start meeting people outside of grad school. It's even hard to meet undergraduates, as you are that ambiguous "grad student". I highly recommend having friends outside of grad school. It keeps you sane.

Which is really important. Graduate school is a marathon, not a sprint. It's important to stay healthy, allowing you to take the time to figure out what you want to do. Because what happens once you graduate? This is an age old debate and really varies from person to person. There are three options:

1) Become a research professor
2) Become a teaching professor
3) Go into industry

Being a research professor means you teach on the side and write grants. It is extremely difficult to become one. The smartest hacker I know couldn't get a job as a professor, which humbled me. If he couldn't get one, what are the chances of me getting one? Not only that, a research professor seems more like a manager of a small group with a bit more leeway. You don't research anymore. You travel all the time, write grants, and find money. I already hate traveling. Planes suck. If this sounds great, perhaps being a research professor is right for you. The other option, becoming a teaching professor, seems a lot more appealing. You teach, and do research if you want, but it's not the focus. You are paid a salary to teach and don't have to hunt for grants. Industry is the third option. Hopefully you don't work at a company where you are a glorified employee, but instead get into a cozy research lab. I haven't graduated yet, leaving me little to say about it.

All in all, I simply adore my time here. Perhaps I'm a rarity, as a lot of people will say "don't do it". Jorge Cham of PhD Comics gave a talk at UCI, and one undergrad asked, "Should I go to graduate school?" About half the audience SCREAMED "Don't do it!" I know one person who said she hated her experience, leaving me with no idea why she stayed.

Should you do it? Grad school opens massive amounts of doors. I got to have lunch with Brendan Eich, the CTO of Mozilla and creator of JavaScript. Call me star struck, because I am. It's freaking awesome. And that sums up grad school. You get to hang out with Brendan Eich. Would you get to if you didn't go to graduate school? Who knows.

An older coworker once told me: "You never regret what you did. You regret the things you didn't do." Try grad school and see - worst case, you drop out and get a job.

Good luck, and feel free to ask me any questions or chime in with your own experiences (especially if you're not in the sciences - I'd like to hear your experience).


Thanks to Joshua Shapiro, Michael Bebenita, Christoph Kerschbaumer, Jon Nguyen, and Min Hur for proofreading and their feedback.