Evaluating technical proficiency is a crucial part of interviewing developers. At Unabridged Software, we test for technical ability with a take-home project. A good project represents the work a candidate would do if hired: most of our work involves building database-backed web applications from a list of business requirements, and we work primarily in Ruby and Ruby on Rails. So our project tests a candidate's abilities along those lines, with a similar stack.
Starting the Review
We usually allow candidates to complete the project over a weekend. When we get it back, we begin evaluating their work with a few simple questions:
- Does the application run?
- Was it set up in a relatively conventional way?
- Were any non-standard setup steps documented and simple to follow?
These initial questions set a relatively low bar. We are looking for the candidate to have built, run, and tested their application, and to have provided clear written steps so that others can run and test it too. Not only should the code run, but the app should meet the user requirements defined by the task. We don't expect the solution to be optimal, but it should meet the broad parameters from a user-facing perspective.

One indicator of success is the presence of automated integration tests. Our project does not strictly require these, but they are evidence of thorough testing; an applicant who tests user stories by hand is not penalized, provided they actually do that testing. If the applicant misunderstood a requirement, we try to strike a balance between teasing out the reason for the misunderstanding and assessing the code against the requirements as the applicant understood them.
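For illustration, a happy-path integration test might look like the sketch below. This assumes RSpec and Capybara and a hypothetical `Task` resource; our project mandates none of these, and hand-testing the same flow is equally acceptable.

```ruby
# A minimal RSpec system spec covering one user story end to end.
# "Task", its path helpers, and the form labels are all hypothetical.
require "rails_helper"

RSpec.describe "Creating a task", type: :system do
  it "saves the task and shows it in the list" do
    visit new_task_path

    fill_in "Title", with: "Write take-home README"
    click_button "Create Task"

    # The user-facing requirement is what matters: the record is visible.
    expect(page).to have_content("Write take-home README")
  end
end
```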
Reviewing Database Schema
After completing the happy-path testing scenarios, we look at the applicant's database schema and models:
- Do the tables and columns make sense?
- Are the attributes of an appropriate data type?
- Are validations in place, and do they make sense given the task at hand?
- Are one-to-many and many-to-many associations set up correctly, and do those associations make sense within the domain?
- Can the applicant explain why they chose this schema?
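As a concrete sketch of what "makes sense" looks like, consider a hypothetical blog domain (the names and columns below are illustrative, not part of our actual task):

```ruby
# One-to-many: an Author has many Posts.
class Author < ApplicationRecord
  has_many :posts, dependent: :destroy

  validates :email, presence: true, uniqueness: true
end

# Many-to-many: Posts and Tags joined through an explicit join model.
class Post < ApplicationRecord
  belongs_to :author
  has_many :post_tags, dependent: :destroy
  has_many :tags, through: :post_tags

  validates :title, presence: true, length: { maximum: 200 }
end

class PostTag < ApplicationRecord
  belongs_to :post
  belongs_to :tag
end

class Tag < ApplicationRecord
  has_many :post_tags, dependent: :destroy
  has_many :posts, through: :post_tags
end
```

Validations live where the data lives, associations are declared on both sides, and a candidate should be able to explain each choice (e.g., why a join model instead of `has_and_belongs_to_many`).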
We also look at the RESTful routes and controller actions:
- Do the paths and verbs make sense from a resource perspective?
- Do the parameters and return types make sense?
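A conventional Rails answer to these questions is resourceful routing. Continuing the hypothetical blog example, a routes file like this maps standard HTTP verbs and paths onto the expected controller actions:

```ruby
# config/routes.rb (hypothetical resources)
Rails.application.routes.draw do
  resources :posts do
    resources :comments, only: [:create, :destroy]
  end
end

# `resources :posts` yields the conventional mappings, e.g.:
#   GET    /posts      -> posts#index
#   POST   /posts      -> posts#create
#   GET    /posts/:id  -> posts#show
#   PATCH  /posts/:id  -> posts#update
#   DELETE /posts/:id  -> posts#destroy
```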
There is room for subjectivity in answering all of these questions. However, we frequently find that the boundaries between systems (e.g., between the app and the database, or between the app and the internet at large) are good indicators of the quality of a candidate's work. Those decisions are often difficult to change later, so we look for candidates who show care in how they initially set them up.
Reviewing Error States
Next, we turn our focus to error states:
- If I submit a form with invalid fields, does the action succeed anyway, or does the system validate my submission and return a list of errors?
- Can I easily create cases where exceptions are thrown instead of being handled appropriately?
The best way to handle errors depends on the needs of the user and the requirements of the application. Our evaluation task requires that errors be handled but does not specify how. When evaluating error handling, we care more about whether the candidate handled errors reasonably than about the particulars of the implementation.
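One reasonable shape for this in Rails is the validate-then-report pattern sketched below (the controller and attributes are hypothetical): a failed save re-renders the form with the model's errors rather than raising or silently succeeding.

```ruby
class PostsController < ApplicationController
  def create
    @post = Post.new(post_params)

    if @post.save
      redirect_to @post, notice: "Post created."
    else
      # @post.errors is available to the re-rendered form for display.
      render :new, status: :unprocessable_entity
    end
  end

  private

  # Strong parameters keep unexpected fields out of mass assignment.
  def post_params
    params.require(:post).permit(:title, :body)
  end
end
```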
Reviewing Organization and Readability
Given the model-view-controller pattern used by Rails, and given that business code must exist somewhere, how is the application organized? Ideally, the models represent domain objects, the views handle display (and do very little logic or querying of their own), and the controllers stitch it all together. We expect to see JavaScript, CSS, and HTML defined in the appropriate places, with the HTML structured appropriately and the CSS and JavaScript organized logically.
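A small illustration of that division of labor (hypothetical names): instead of a view template calling `Post.where(...)` directly, the model exposes a named scope and the controller hands the result to the template, which only iterates.

```ruby
# Model: owns the query logic.
class Post < ApplicationRecord
  scope :recently_published, -> {
    where.not(published_at: nil).order(published_at: :desc)
  }
end

# Controller: stitches model and view together.
class PostsController < ApplicationController
  def index
    @posts = Post.recently_published
  end
end

# app/views/posts/index.html.erb then simply renders @posts,
# with no querying or business logic of its own.
```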
- Assuming the code is reasonably organized, how readable is it?
- Are variable, method, and class names meaningful?
- Does the code generally follow Ruby conventions (e.g., is it consistently indented)?
- Are there explanatory comments anywhere the function or intent of some code might not be obvious?
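To make the naming point concrete, here is a made-up before/after: the logic is identical, but the second version reads like a sentence and needs no comment to explain its intent.

```ruby
# Before: opaque names force the reader to reverse-engineer intent.
def chk(u)
  u[:a] >= 18 && u[:s] == "active"
end

# After: names carry the domain meaning.
ADULT_AGE = 18

def eligible_member?(member)
  member[:age] >= ADULT_AGE && member[:status] == "active"
end
```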
Reviewing the Project Together
After reviewing the project ourselves, we ask candidates to demonstrate their application for us and walk us through the design decisions they made. This step in the evaluation mirrors how our developers interact with clients daily. We make allowances for the inherent stress of interviews, but expect that qualified candidates can discuss and explain the choices they have made.
After we've reviewed their project together, we ask the candidate to make a change. Often, we pair-program this piece with the candidate. In doing so, we expect that candidates will respond gracefully to feedback and that they will be able to act on that feedback.
Additional Items to Review
In no particular order, we also look for evidence that the candidate can:
- Use and interact with lists (arrays and ORM collections).
- Use and interact with key/value pairs (hashes, dictionaries, maps, objects, etc.).
- Make use of methods on the String class.
- Create forms and successfully handle form submissions.
- Incorporate libraries to accomplish objectives (CSV parsing, file attachments, etc.).
- Perform ORM queries in a relatively efficient and idiomatic fashion.
- Write tests that validly and accurately describe the behavior.
- Use version control.
- Ask questions when requirements or details are unclear.
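A tiny, self-contained sample of several of the everyday skills above (arrays, hashes, and `String` methods), using a made-up "name:score" parsing task:

```ruby
# Parse lines like "alice: 90" into a hash of capitalized name => Integer score,
# skipping blank lines along the way.
def parse_scores(lines)
  lines.map(&:strip).reject(&:empty?).to_h do |line|
    name, score = line.split(":", 2)
    [name.strip.capitalize, score.to_i]
  end
end

parse_scores(["alice: 90", "  bob:75", ""])
# => { "Alice" => 90, "Bob" => 75 }
```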
A Note on Seniority
For more senior developers, we look for more in-depth expertise with both our standard tech stack (Ruby, Ruby on Rails, React, PostgreSQL) and with their preferred tech stack. We also look for evidence that a senior candidate is willing and able to assume more leadership responsibilities such as project management and mentorship of junior developers.