Codemanship Exercise “Play-Through” – CD Warehouse Part #4

In yesterday’s post, I implemented the feature to allow record labels to send batches of CDs into the warehouse. Our design is now growing to the point where compromises have to be made – meaning that, at times, we have to choose between, say, making sure classes have a single responsibility and making sure they fully encapsulate their data.

It’s very important to make our code as clean as we can, but it’s also important to remember that perfection on anything more complex than the most trivial problems isn’t possible. We might aim for perfection, but we have to accept we’ll never reach it.

This is a nuanced message. Many developers see it as justification for not aiming for perfection – indeed, for not even trying to keep their code clean in some cases. Other developers, myself included, hear “Aim for the stars and you’ll reach the Moon”.

Perfectionism has been given a bad rap in software development. Managers in particular often have an irrational fear of developers “taking quality too far”. Anyone who’s dedicated a few years to looking at people’s code will know that nobody out there is producing software that’s too good.

Keeping code clean is hard. And the more the code grows and ages, the harder it gets – just as staying in shape in your forties is harder than it was in your thirties. Developers usually greatly underestimate how much attention and effort needs to go into keeping code easily maintainable, and teams typically under-invest in it.

Having said all that, it is possible to over-invest: to devote too much thought and effort to studying and refactoring our designs. Like in life: we need to invest in our health and fitness if we want to live a long and full existence. But we also need to get on with life – living, loving, reading, traveling, dancing, painting, snowboarding, and whatever else is the thing that gets us up in the morning.

Software’s a bit like that. We need to invest in giving it a long useful life. But we also need to get shit done. Finding that balance continues to be a subject of great interest (and frustration) to me. How much refactoring is too much? And how little refactoring is too little? Ultimately, what matters is the end result – how much value did we deliver, and for how long?

Anyway, enough navel gazing. Let’s pick up where we left off yesterday and get this shit done.

We’re nearly at the end of our list of tests for the CD warehouse.

  • Buy a CD
    • Payment accepted
    • Payment rejected
    • Out of stock
  • Search for a CD (by title and artist)
    • Warehouse has matching title
    • No match found
  • Receive batch of CDs
    • Of existing title in catalogue
    • Of new title (add to catalogue)
    • Of multiple titles
  • Review a CD
    • Just a rating
    • Rating and review

Just two more tests to pass. Okay – let’s move on to reviewing a CD. There are a couple of happy paths in our test list:

  • Review a CD
    • Just a rating
    • Rating and review

The simplest case is when they just leave a rating. Let’s write a failing test for that.
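A rough sketch of what that first failing test might look like – the class and method names here, and the Jest-style assertions, are illustrative stand-ins rather than verbatim listings from my actual code:

    const CD = require('../src/cd');

    test('customer leaves just a rating', () => {
        // Hypothetical shape: a CD with artist, title, price and stock count
        const cd = new CD('Abba', 'Gold', 9.99, 10);

        cd.review(4);

        expect(cd.reviews.length).toBe(1);
        expect(cd.reviews[0].rating).toBe(4);
    });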

To pass this test, we simply push the new review on to the CD’s reviews list.
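In sketch form (again, details illustrative), that’s a small addition to CD:

    class CD {
        constructor(artist, title, price, stockCount) {
            this.artist = artist;
            this.title = title;
            this.price = price;
            this.stockCount = stockCount;
            this.reviews = [];
        }

        // ...buying and searching from the earlier posts omitted...

        review(rating) {
            this.reviews.push({ rating: rating });
        }
    }

    module.exports = CD;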

Now let’s add text to our review in our next failing test.
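Something along these lines (sketched, not verbatim):

    test('customer leaves a rating and a review', () => {
        const cd = new CD('Abba', 'Gold', 9.99, 10);

        cd.review(5, 'Their best compilation by a mile');

        expect(cd.reviews[0].rating).toBe(5);
        expect(cd.reviews[0].text).toBe('Their best compilation by a mile');
    });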

This is easy to pass.
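We just let review() accept the optional text and store it alongside the rating – roughly:

    class CD {
        // ...as before...

        review(rating, text) {
            this.reviews.push({ rating: rating, text: text });
        }
    }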

So that’s us done, then. Isn’t it? We’ve crossed off the last two tests on our list.

But there may be one or two little wrinkles we need to think about. Firstly, the requirements said that customers could leave reviews for CDs they’ve bought. What if we try to leave a review for a CD we didn’t buy?

We currently don’t record who bought a CD, and our model has no concept of customer. So there’s a couple more tests we need to add to our list. First, when a CD is bought, we need to make a record of it for the customer who bought it.
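A sketch of that test – note that it reaches directly into the customer’s CD collection, which we’ll come back to, and that the payment handling from the earlier posts is glossed over here:

    const CD = require('../src/cd');
    const Customer = require('../src/customer');

    test('buying a CD adds it to the customer\'s collection', () => {
        const cd = new CD('Abba', 'Gold', 9.99, 10);
        const customer = new Customer();

        cd.buy(customer); // payment details from earlier posts omitted in this sketch

        expect(customer.cds).toContain(cd);
    });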

Passing this test is simples.
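In outline:

    class CD {
        // ...constructor and other members as before...

        buy(customer) {
            // ...payment and stock checks from the earlier posts omitted...
            this.stockCount--;
            customer.cds.push(this); // CD reaching into Customer's data
        }
    }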

We’re on a green light again. Let’s review the code. I’m not happy about CD accessing the customer’s CD collection directly (or the test, for that matter). Let’s extract some methods and move them to Customer to encapsulate this.
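After that refactoring, Customer owns its own collection and exposes intention-revealing methods – something like:

    class Customer {
        constructor() {
            this.cds = [];
        }

        add(cd) {
            this.cds.push(cd);
        }

        has(cd) {
            return this.cds.includes(cd);
        }
    }

    module.exports = Customer;

CD.buy() now calls customer.add(this), and the test asserts customer.has(cd) instead of digging into the customer’s array.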

Now that we know what CDs a customer has bought from us, we can add an edge case for when a customer tries to leave a review for a CD that’s not in their collection.
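In this sketch, review() grows a customer parameter so the model knows who’s doing the reviewing:

    test('customer cannot review a CD they have not bought', () => {
        const cd = new CD('Abba', 'Gold', 9.99, 10);
        const customer = new Customer();

        expect(() => cd.review(customer, 4)).toThrow();
    });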

And to pass this, we just need to check if the customer has that CD.
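One way to pass it: introduce a Review object that remembers who wrote it, and guard the pre-condition in CD. (The message chain here is deliberate – it’s what we clean up next.)

    class Review {
        constructor(customer, rating, text) {
            this.customer = customer;
            this.rating = rating;
            this.text = text;
        }
    }

    class CD {
        // ...constructor and other members as before...

        review(customer, rating, text) {
            const review = new Review(customer, rating, text);
            if (!review.customer.has(this))
                throw new Error('Only customers who have bought this CD can review it');
            this.reviews.push(review);
        }
    }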

Now we’re back on a green light, let’s review this new code. I’m not happy about the message chain I’ve left: review.customer.has(this). Let’s encapsulate that in the Review class. (This refactoring is called Hide Delegate).
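Hidden, it might look like this:

    class Review {
        // ...constructor as before...

        isByOwnerOf(cd) {
            return this.customer.has(cd);
        }
    }

    class CD {
        // ...

        review(customer, rating, text) {
            const review = new Review(customer, rating, text);
            if (!review.isByOwnerOf(this))
                throw new Error('Only customers who have bought this CD can review it');
            this.reviews.push(review);
        }
    }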

Okay. We’re nearly there. One last question for our real customer: what are these ratings for? As it stands, a CD can have a list of reviews with ratings. But do we expect our users to go through the list adding them up to calculate an average rating? I think the software should do that. Let’s add one more failing test.
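Sketched:

    test('average rating is calculated across all reviews', () => {
        const cd = new CD('Abba', 'Gold', 9.99, 10);
        const customer = new Customer();
        cd.buy(customer);

        cd.review(customer, 5);
        cd.review(customer, 2);

        expect(cd.averageRating()).toBe(3.5);
    });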

To pass this test, we just sum the ratings and divide by their number.
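A first cut might cram it all into one expression:

    class CD {
        // ...

        averageRating() {
            return this.reviews.reduce((total, review) => total + review.rating, 0) / this.reviews.length;
        }
    }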

Okay. Back on a green light. Let’s do a mini code review. Well, I’m not convinced my code for calculating the average rating’s entirely obvious. Let’s try extracting a function for calculating the sum.
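Which reads a little better:

    class CD {
        // ...

        averageRating() {
            return this.sumOfRatings() / this.reviews.length;
        }

        sumOfRatings() {
            return this.reviews.reduce((total, review) => total + review.rating, 0);
        }
    }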

Next on our checklist is duplication. There is some duplicated set-up in the tests, but only a small amount, and we’re on the edge of the Rule of Three here. I could invest time in extracting set-up methods, but would that investment pay off considering that we’re actually done? I’m going to leave it as it is, favouring readability.

All the methods and classes in our code are pretty simple – no loops inside loops inside IFs or anything like that.

But I do now think – yet again – that CD has more than one job. Let’s extract a class for handling reviews.
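One possible split – a Reviews class that owns the collection and the arithmetic, with CD delegating to it (names illustrative, as before):

    class Reviews {
        constructor() {
            this.items = [];
        }

        add(review) {
            this.items.push(review);
        }

        averageRating() {
            return this.sumOfRatings() / this.items.length;
        }

        sumOfRatings() {
            return this.items.reduce((total, review) => total + review.rating, 0);
        }
    }

    class CD {
        // ...constructor now initialises this.reviews = new Reviews()...

        review(customer, rating, text) {
            const review = new Review(customer, rating, text);
            if (!review.isByOwnerOf(this))
                throw new Error('Only customers who have bought this CD can review it');
            this.reviews.add(review);
        }

        averageRating() {
            return this.reviews.averageRating();
        }
    }

Clients still talk only to CD, which forwards the work to Reviews behind the scenes.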

In my design now, CD does one job, and simply acts as a facade to the other responsibilities, shielding any clients from knowledge of the little subsystem it’s now composed of.

Now we should check through the methods in our classes for any examples of Feature Envy we may have added. Nothing to report there. (Remember, there are examples of Feature Envy that we knowingly introduced as side-effects of splitting apart cohesive responsibilities so that classes do just one job.)

Next, let’s check that the dependencies in our classes are swappable by dependency injection. The quickest way to do this in JavaScript is to look for imports. Aside from a factory class, none of the other classes in our model have direct imports. All of their dependencies are given to them from the outside. And we can see that all the imports are at the top of the call stack, in the tests, e.g.
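For example, the top of one of the test modules looks roughly like this (paths illustrative):

    // The tests sit at the top of the call stack and do the wiring;
    // the model classes get their collaborators passed in from outside.
    const CD = require('../src/cd');
    const Customer = require('../src/customer');
    const Review = require('../src/review');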

Good modular designs wear their dependencies on the outside.

Finally, let’s check that none of our modules are referencing things they don’t use. In JavaScript, this again means checking the imports.

In this test module, WebStorm tells me that OutOfStockError is unused. Let’s organise our imports and get rid of it.

And that, folks, is my play-through of the CD warehouse exercise from the Codemanship TDD course. No doubt, you’ll have noticed some things while we did it. For example, in some places I chose a defensive style of programming, raising errors if pre-conditions for buying a CD and for leaving a review weren’t met. In other places, I assumed that the client code (e.g., the UI) wouldn’t invoke methods on the model if the precondition was broken (e.g., asking for the average of zero CD ratings).

Which approach I chose depended on whether the edge case had been discussed with the customer. If we agreed that only people who’ve bought a CD can review it, there are multiple ways we can enforce that in the code. The way I prefer is to not give the end user that choice if it isn’t valid. This can be done by good UI design, for example. Maybe the user is only presented with a “Review” button for CDs that are in their collection.

Anyhoo, that’s a discussion for another time.

 

If you’d like to see the complete JS code for my CD warehouse, it’s here.

 

Author: codemanship

Founder of Codemanship Ltd and code craft coach and trainer
