
Writing a Play 2.0 Module

The intro

Hot off the press, Play 2.0 has arrived, and has been welcomed into the arms of a fast paced community that loves new things. The day it was released, we started a new project here, and on that particular day and on this particular project we felt particularly daring. So we decided to use Play 2.0 for our project. New is an understatement for Play 2.0: it's not just an incremental improvement on Play 1.x, many parts of it have been completely rewritten. There is still much work to do, and one of the glaring gaps that has yet to be filled is modules. At the time of writing, there is no official listing or repository of modules for Play 2.0, in stark contrast to the rich ecosystem of modules for Play 1.x. Furthermore, there is no documentation on how to write a module. So, given that modules tend to be very useful, and we were starting a new project, we very quickly ran into the need to write our own module, which we did: the MongoDB Jackson Mapper Play 2.0 Module. To help the rest of the community of early Play 2.0 adopters, I've decided to write a (very) short guide on writing Play 2.0 modules.

The disclaimer

So did I mention that there was no documentation on writing modules, and very little in the way of example code to copy from?  What I’ve written may well be not the right way to do things.  But with no documentation, how am I to know?  All I know is that it’s working for us, and that’s good enough for me.  So if you happen to know what the right way to write Play 2.0 modules is, don’t bother commenting on this telling me that I’m wrong.  Just write the damn documentation!

The setup

In Play 1.x, writing a module usually starts with running play new-module. Slight problem here:

$ play new-module
       _            _
 _ __ | | __ _ _  _| |
| '_ \| |/ _' | || |_|
|  __/|_|\____|\__ (_)
|_|            |__/ 

play! 2.0, http://www.playframework.org

This is not a play application!

Use `play new` to create a new Play application in the current directory,
or go to an existing application and launch the development console using `play`.

You can also browse the complete documentation at http://www.playframework.org.

Ok, so that doesn't work. It looks like there's no way to create a new Play module, so I decided to simply write a vanilla SBT project. I won't go into the details of how to set up a new SBT project, but here are the Play-specific bits that you'll need:

resolvers ++= Seq(
    DefaultMavenRepository,
    Resolver.url("Play", url("http://download.playframework.org/ivy-releases/"))(Resolver.ivyStylePatterns),
    "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/",
    "Typesafe Other Repository" at "http://repo.typesafe.com/typesafe/repo/"
)

libraryDependencies += "play" %% "play" % "2.0"

libraryDependencies += "play" %% "play-test" % "2.0" % "test"

Now do whatever you need to do to open it up in your favourite IDE/editor (I personally use IntelliJ IDEA, so I use the sbt-idea plugin).

The code

It’s worth first noting that the module that I wanted to write only had to load a MongoDB connection pool, according to configuration supplied in application.conf, and manage the lifecycle of that pool.  The trait that needed to be implemented to do this is play.api.Plugin.  It has three methods, onStart(), onStop() and enabled().  If you’re looking for information on how to do things like intercept HTTP calls and define custom routes, then I suspect it’s easy to do (probably by just defining a routes file in your plugin), but I didn’t need to do that so I’m not going to pretend that I know how.

There’s not really much to say now, my plugin looks something like this:

import play.api.{Application, Plugin}

class MongoDBPlugin(val app: Application) extends Plugin {
  private lazy val (mongo, db, mapper) = {
    // insert code here to initialise from application config
  }

  override def onStart() {
    // trigger lazy loading of the mongo field
    mongo
  }

  override def onStop() {
    mongo.close()
  }

  override def enabled() = !app.configuration.getString("mongodb.jackson.mapper")
      .filter(_ == "disabled").isDefined
}

The actual code does a fair bit more than this, but none of that is specific to how to write a plugin.

The finishing touch

So I've written my plugin, there's now one thing left to do… tell Play about it! This is done by defining a play.plugins file in the root of the classpath (in the resources folder):

1000:play.modules.mongodb.jackson.MongoDBPlugin

The leading integer defines the priority with which the plugin is loaded. The example I saw used 1000, so I decided to use that too.

The resolution

Now all I have to do to use this module is add it as a dependency.  Play will automatically pick it up, and it will be automatically started and stopped as necessary.  Happy hacking, and if you’re starting a Play 2.0 project, and want to use MongoDB, why not try the MongoDB Jackson Mapper Module!

Extending Guice

Guice is a framework that I had been looking forward to trying out for a while, but until recently I never had the opportunity. Previously I had mostly used Spring (with a dash of PicoContainer), so when I got the opportunity to start using Guice, I naturally had a number of my favourite Spring features in mind as I started using it. Very quickly I found myself wanting an equivalent of Spring's DisposableBean. Guice is focussed on doing one thing and doing it well, and that thing is dependency injection. Lifecycle management doesn't really come into that, so I am not surprised that Guice doesn't offer native support for disposing of beans. There is one Guice extension out there, Guiceyfruit, that does offer reasonably complete per-scope lifecycle support; however, Guiceyfruit requires using a fork of Guice, which didn't particularly appeal to me. Besides, Guice is very simple, so I imagined that providing my own simple extensions to it would also be simple. I was right.

Though, to be honest, while the extensions themselves are simple, it wasn’t that simple to work out how to write them.  On my first attempt, I gave up after Googling and trying things out myself for an hour.  On my second attempt, I almost gave up with this tweet.  But, I stuck with it, and eventually made my breakthrough. The answer was in InjectionListener. This listener is called on every component that Guice manages, including both components that Guice instantiates itself, and components that are provided as instances to Guice.

Supporting Disposables

So, I had my disposable interface:

public interface Disposable {
  void dispose();
}

and I wanted any component that implemented this interface to have their dispose() method called when my application shut down.  Naturally I had to maintain a list of components to dispose of:

final List<Disposable> disposables = Collections.synchronizedList(new ArrayList<Disposable>());

Thread safety must be taken into consideration, but since I only expected this list to be accessed when my application was starting up and shutting down, a simple synchronized list was sufficient; there was no need to worry about highly performant concurrent access.

My InjectionListener is very simple; it just adds disposables to this list after they've been injected:

final InjectionListener<Disposable> injectionListener = new InjectionListener<Disposable>() {
  public void afterInjection(Disposable injectee) {
    disposables.add(injectee);
  }
};

InjectionListeners are registered by registering a TypeListener that listens for events on types that Guice encounters. My type listener checks whether the type is Disposable (this actually isn't necessary, because we will register it using a matcher that matches only Disposable types, but it's defensive to do the check), and if so registers the InjectionListener:

TypeListener disposableListener = new TypeListener() {
  public <I> void hear(TypeLiteral<I> type, TypeEncounter<I> encounter) {
    if (Disposable.class.isAssignableFrom(type.getRawType())) {
      TypeEncounter<Disposable> disposableEncounter = (TypeEncounter<Disposable>) encounter;
      disposableEncounter.register(injectionListener);
    }
  }
};

Now I can register my TypeListener.  This is done from a module:

bindListener(new AbstractMatcher<TypeLiteral<?>>() {
      public boolean matches(TypeLiteral<?> typeLiteral) {
        return Disposable.class.isAssignableFrom(typeLiteral.getRawType());
      }
    }, disposableListener);

The last thing I need to do is bind my collection of disposables, so that when my app shuts down, I can dispose of them:

bind((TypeLiteral) TypeLiteral.get(Types.listOf(Disposable.class)))
                .toInstance(disposables);

So now when my app shuts down, I can look up the list of disposables and dispose of them:

for (Disposable disposable : (List<Disposable>) injector.getInstance(
    Key.get(Types.listOf(Disposable.class)))) {
  disposable.dispose();
}

If you decide to use this code in your own app, please be very wary of a potential memory leak. Any beans that are not singleton scoped will be added to the disposable list each time they are requested (per scope).  For my purposes, all my beans that required being disposed of were singleton scoped, so I didn’t have to worry about this.
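
To make the pieces easier to see as a whole, here's everything above assembled into a single module (a sketch; the DisposableModule name is mine):

import com.google.inject.AbstractModule;
import com.google.inject.TypeLiteral;
import com.google.inject.matcher.AbstractMatcher;
import com.google.inject.spi.InjectionListener;
import com.google.inject.spi.TypeEncounter;
import com.google.inject.spi.TypeListener;
import com.google.inject.util.Types;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class DisposableModule extends AbstractModule {
  // bound below so the application can look it up at shutdown
  private final List<Disposable> disposables =
      Collections.synchronizedList(new ArrayList<Disposable>());

  @SuppressWarnings({"unchecked", "rawtypes"})
  @Override
  protected void configure() {
    // collect every Disposable component after Guice has injected it
    final InjectionListener<Disposable> injectionListener = new InjectionListener<Disposable>() {
      public void afterInjection(Disposable injectee) {
        disposables.add(injectee);
      }
    };

    // listen only for Disposable types, and register the injection listener on them
    bindListener(new AbstractMatcher<TypeLiteral<?>>() {
      public boolean matches(TypeLiteral<?> typeLiteral) {
        return Disposable.class.isAssignableFrom(typeLiteral.getRawType());
      }
    }, new TypeListener() {
      public <I> void hear(TypeLiteral<I> type, TypeEncounter<I> encounter) {
        if (Disposable.class.isAssignableFrom(type.getRawType())) {
          ((TypeEncounter<Disposable>) encounter).register(injectionListener);
        }
      }
    });

    // expose the list so that shutdown code can dispose of everything in it
    bind((TypeLiteral) TypeLiteral.get(Types.listOf(Disposable.class))).toInstance(disposables);
  }
}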

Supporting annotation based method invocation scheduling

Happy that I now had a very simple extension with very little code for supporting automatic disposing of beans, I decided to try something a little more complex… scheduling. My app contains a number of simple scheduled tasks, and the amount of boilerplate for scheduling each of these was too much for my liking. My aim was to be able to do something like this:

@Schedule(delay = 5L, timeUnit = TimeUnit.MINUTES, initialDelay = 1L)
def cleanUpExpiredData() {
  ...
}

(Yep, this app has a mixture of Scala and Java.) So, I started with my annotation:

@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface Schedule {
    long delay();
    TimeUnit timeUnit() default TimeUnit.MILLISECONDS;
    long initialDelay() default 0;
}

The main difference this time is that I’m not listening for events on a particular type, but rather I want to check all types to see if they have a @Schedule annotated method. This is a little more involved, so I’m going to have a scheduler service that does this checking and the scheduling. Additionally it will make use of the disposable support that I just implemented:

import java.lang.reflect.Method;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

public class SchedulerService implements Disposable {
  private final ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();

  public boolean hasScheduledMethod(Class clazz) {
    for (Method method : clazz.getMethods()) {
      Schedule schedule = method.getAnnotation(Schedule.class);
      if (schedule != null) {
        return true;
      }
    }
    return false;
  }

  public void schedule(Object target) {
    for (final Method method : target.getClass().getMethods()) {
      Schedule schedule = method.getAnnotation(Schedule.class);
      if (schedule != null) {
        schedule(target, method, schedule);
      }
    }
  }

  private void schedule(final Object target, final Method method, Schedule schedule) {
    executor.scheduleWithFixedDelay(new Runnable() {
      public void run() {
        try {
          method.invoke(target);
        } catch (Exception e) {
          e.printStackTrace(); // an exception escaping here would cancel the schedule
        }
      }
    }, schedule.initialDelay(), schedule.delay(), schedule.timeUnit());
  }

  public void dispose() {
    executor.shutdown();
  }
}

Now in my module I instantiate one of these services:

final SchedulerService schedulerService = new SchedulerService();

I then implement my InjectionListener:

final InjectionListener<Object> injectionListener = new InjectionListener<Object>() {
  public void afterInjection(Object injectee) {
    schedulerService.schedule(injectee);
  }
};

and my TypeListener:

TypeListener typeListener = new TypeListener() {
  public <I> void hear(TypeLiteral<I> type, TypeEncounter<I> encounter) {
    if (schedulerService.hasScheduledMethod(type.getRawType()))  {
      encounter.register(injectionListener);
    }
  }
};

And then all I have to do is register my type listener, and also my scheduler service (so that it gets disposed properly):

  bindListener(Matchers.any(), typeListener);
  bind(SchedulerService.class).toInstance(schedulerService);
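
To see how this might be used end to end, here's a hedged sketch (CacheCleaner and MyModule are made-up names; MyModule stands for a module that registers both the disposable support and the scheduling listener above):

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Key;
import com.google.inject.util.Types;
import java.util.List;
import java.util.concurrent.TimeUnit;

public class SchedulingDemo {
  // a hypothetical component with a scheduled method
  public static class CacheCleaner {
    @Schedule(delay = 5L, timeUnit = TimeUnit.MINUTES, initialDelay = 1L)
    public void cleanUpExpiredData() {
      // ... purge expired data ...
    }
  }

  @SuppressWarnings("unchecked")
  public static void main(String[] args) {
    // requesting the instance triggers the InjectionListener, which
    // schedules cleanUpExpiredData() via the SchedulerService
    Injector injector = Guice.createInjector(new MyModule());
    injector.getInstance(CacheCleaner.class);

    // ... later, at shutdown, dispose of everything, including the
    // SchedulerService, which shuts down its executor
    for (Disposable disposable : (List<Disposable>) injector.getInstance(
        Key.get(Types.listOf(Disposable.class)))) {
      disposable.dispose();
    }
  }
}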

Conclusion

Although Guice doesn’t come with as much as Spring does out of the box, it is very simple to extend to meet your own requirements. If I were needing many more container like features, then maybe Spring would be a better tool for the job, but when I’m just after a dependency injection framework with a little sugar on top, Guice is a very nice and much lighter weight solution.

You did what in Scala?

This morning when I got into work, the first thing that anyone said to me was “you did what in Scala?”  Not the usual greeting I get in the morning… clearly I had stirred something up.  I knew exactly what this person was talking about, the evening before I committed some code, and then tweeted this:

Just added scala for the first time to an existing Java project. Not too shaby.

As soon as I saw the build pass on our CI server, I went home, but the tweet caught the attention of my product manager, and he was very intrigued. What I had in fact done was start writing unit tests in Scala for an existing Java service that I was working on. Why did I do this? A number of reasons:

  1. I’ve been meaning to learn Scala for at least a year.
  2. I’ve seen Scala unit tests before, and they look very cool, they’re very good at minimising boilerplate, and very easy to read and understand.
  3. At VZ, we are free to make sensible technology choices.  This ranges from what libraries we use, to what databases we use, to what languages we use.  Nothing is off limits, as long as we can provide a good argument as to why it’s better than the alternatives.  And when we do that, our managers trust us.

My product manager of course had no problems with me using Scala; we have another project here that uses Scala, and he thought I meant I had done some work on that, and was interested in knowing why. After explaining that I had actually added Scala to the project I was supposed to be working on, he was completely fine with it. That's one of the things I love about working for VZ: we have the freedom to make our own decisions.

For those that are not familiar with Scala, here is a quick overview of how I introduced Scala into my existing Java project.

First, I did my research. What unit testing frameworks are there in Scala? You'll quickly find that there are two popular frameworks, one called specs, and another called ScalaTest. ScalaTest supports a number of different testing styles, including TDD and BDD, while specs only supports BDD. I only wanted BDD, so both were equal to me at this point. Further research showed that specs has good integration with my favourite mocking framework, Mockito, so I went with specs. I suggest you do your own research for your own purposes; my comparison here is far from complete.

Next, since I’m using Maven, I needed to add Scala to my maven project.  I found a blog post that explained how to add Scala to a maven project in 4 steps, and I was able to build my project in no time.  I also added a dependency on the specs library, and configured the Maven surefire plugin to run any classes ending in Test or Spec, as per the instructions for integrating with Maven and JUnit in the specs documentation.  I use IntelliJ IDEA as my IDE, so I searched for a Scala plugin in my preferences, found one, installed it, and after a restart IDEA had Scala support.  The IDEA instructions say that you need to install the Scala SDK, but since I was using Maven, I could just add the scala compiler as a provided maven dependency, then go to the Scala compiler preferences and point IDEA at that dependency.

Finally, I had to write my tests. Below is the first test that I wrote. If you're a Scala guru, I'm sure you'll see things that I could have done more simply or conventions that I haven't followed, so I'm happy for you to point them out to me; I'm still learning.

class WorkResultHandlerSpec extends SpecificationWithJUnit with Mockito {
  "Work result handler" should {
    val tracker = mock[WorkResultTracker]
    val handlerChain = mock[HandlerChain]
    val workUnit = WorkUnit.builder(JobType.TEST_MESSAGE, null).build
    val job = Job.builder(JobType.TEST_MESSAGE).build
    val handler = new WorkResultHandler(tracker)

    "call handler chain only once" in {
      handler.handle(job, workUnit, handlerChain)
      there was one(handlerChain).passToNextHandler(job, workUnit)
    }

    "pass the result to the tracker" in {
      val workResult = WorkResult.success
      handlerChain.passToNextHandler(job, workUnit) returns workResult
      handler.handle(job, workUnit, handlerChain)
      there was one(tracker).trackWorkResult(JobType.TEST_MESSAGE, workResult)
    }

    "return the result" in {
      val workResult = WorkResult.success
      handlerChain.passToNextHandler(job, workUnit) returns workResult
      handler.handle(job, workUnit, handlerChain) mustEq workResult
    }

    "track an exception as a failure" in {
      handlerChain.passToNextHandler(job, workUnit) throws new RuntimeException("Something bad happened")
      val workResult = handler.handle(job, workUnit, handlerChain)
      workResult.getStatus.isSuccess must_== false
      workResult.getMessage mustEq "Something bad happened"
      there was one(tracker).trackWorkResult(JobType.TEST_MESSAGE, workResult)
    }
  }
}

Testing permutations of configurations

Permutations are an everyday part of VZ development. Most obviously, we have three different platforms, each with different names, different base URLs, different wordings and of course different colours. But then we also have two different languages that we currently support, English and German. On top of that we often do AB testing, where we'll have different variants of the same feature displayed or implemented in slightly different ways, and we present the different ways to different users and then gather metrics to see which ways the users seem to prefer. Finally, there are times where you have different functions, but the functions share much of their functionality. You end up with a massive list of permutations of different ways the code can be executed, too many to ever test manually, and too many to write and maintain individual tests for.

The service that I’ve been spending a lot of time on at VZ is what we call the “notificator”.  It is responsible for generating all the HTML emails that the platform sends, from registration emails through to new message notifications, event invites, photo comments etc.  Each notification type shares a lot of its functionality with the other notification types, the emails all look very similar, and sometimes only differ by what resource keys are used to generate their wording.

There are many bugs that could be introduced in this system.  Here are some examples of things that I want to and can automatically test:

  • All generated HTML is valid markup
  • All keys that the templates use exist in our resource bundles.  When a key doesn’t exist, text like this ends up in the email: ???new.comment.action???
  • All URLs in links and images are absolute URLs and are to the right platform
  • Standard headers and titles in emails are correct
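
As a taste, the third check in that list might look something like this with jsoup (a sketch; the assertAllUrlsAreAbsolute helper name is mine, and the Document and Platform come from the test parameters set up below):

private void assertAllUrlsAreAbsolute(Document body, Platform platform) {
  // every link and image in the email must point at the right platform
  for (Element link : body.select("a[href]")) {
    assertThat(link.attr("href"), startsWith(platform.baseUrl));
  }
  for (Element img : body.select("img[src]")) {
    assertThat(img.attr("src"), startsWith(platform.baseUrl));
  }
}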

There are also many specific things for each notification type that I want to test.  The requirements I have mean that I can’t just run the tests for one notification type, or for one language, or for one platform, or for one AB testing variant.  I have to run the tests for every permutation of these.  Writing these tests manually would be a nightmare.  Fortunately, JUnit has a few features that can help us here.

Setting up configurations

Before we go into the details of how to use JUnit to help, we need to set up representations of our configurations. This can be most easily done using enums. For language, variants and platforms, we can use quite simple enums:

public enum Language {
  GERMAN("de"), ENGLISH("en");
  public final String abbreviation;
  Language(String abbreviation) {
    this.abbreviation = abbreviation;
  }
}
public enum Variant {
  SMALL_PROFILE_PICS, LARGE_PROFILE_PICS
}
public enum Platform {
  SCHUELERVZ("http://www.schuelervz.net"), STUDIVZ("http://www.studivz.net"), FREUNDEVZ("http://www.freundevz.net");
  public final String baseUrl;
  Platform(String baseUrl) {
    this.baseUrl = baseUrl;
  }
}

Sometimes these enums might already exist in some form in your code, or you may have to create them specifically for the tests. Using ones specific to your tests has the advantage that you can add metadata that is important to the tests, as I've done above with the base URLs for the platforms.

For the notification types, I wanted a bit more functionality, for example, code to run notification-type-specific assertions. There are many ways this could be implemented; I decided to do it using anonymous classes in an enum, implementing a method that accepts a jsoup Document to run assertions on:

public enum NotificationType {
  NEW_MESSAGE(new MessageData("Test Subject", "Test content")) {
    public void runAssertions(Document body) {
      assertThat("Test Subject", equalTo(body.getElementById("subject").text()));
      assertThat("Test content", equalTo(body.getElementById("content").text()));
    }
  },
  GRUSCHEL(new GruschelData()),
  ...;
  public final Object testData;
  NotificationType(Object testData) {
    this.testData = testData;
  }
  public void runAssertions(Document body) {
  }
}

Using JUnit parameters

Now that I’ve got the different configurations, I can write a test that JUnit will run for every permutation of configurations. For my first attempt, I’m going to use JUnit parameters. This is by far the simplest way to do things. The first thing to do is declare the runner for the test class:

@RunWith(Parameterized.class)
public class EmailGenerationTest {

Now I can set up my permutations. The way the JUnit parameterized runner works is that you annotate a method with @Parameterized.Parameters, and that method must return a collection of object arrays, each nested array being the set of arguments to pass to the test's constructor for each permutation. I'm going to implement this like so:

private final Variant variant;
private final NotificationType type;
private final Platform platform;
private final Language language;
public EmailGenerationTest(Variant variant, NotificationType type, Platform platform, Language language) {
  this.variant = variant;
  this.type = type;
  this.platform = platform;
  this.language = language;
}
@Parameterized.Parameters
public static Collection<Object[]> generateParameters() {
  Collection<Object[]> params = new ArrayList<Object[]>();
  for (Variant variant: Variant.values()) {
    for (NotificationType type: NotificationType.values()) {
      for (Platform platform: Platform.values()) {
        for (Language language: Language.values()) {
          params.add(new Object[] {variant, type, platform, language});
        }
      }
    }
  }
  return params;
}

Finally I can write my tests. Each test method that I write will be run once for each permutation of parameters that I have generated.

@Test
public void noResourceKeysShouldBeMissing() {
  String html = ...// code to generate email given the parameters
  assertThat(html, not(containsString("???")));
}
@Test
public void notificationSpecificAssertionsShouldPass() {
  Document body = ...// code to generate jsoup document of the email given the parameters
  type.runAssertions(body);
}
...

This works very nicely: I can add new notification types, variants, languages and platforms, and I only have to change my tests in the one place specific to that configuration. I can also add new general tests in one place, and they get run for every permutation. However, there is one problem: JUnit names each set of parameters with a sequential number. Working out which number relates to which permutation can be difficult, especially considering that we are dynamically generating the parameters. Here's an example of what such a test run looks like in IntelliJ IDEA:

[Screenshot: parameterised test run in IntelliJ IDEA]

You can see that I don't get much information. Maven test runners are similarly unhelpful. However, there is another strategy you can use to make sure you have the right information about failures.

Custom suites

This method is quite involved; if you only have a handful of permutations, it's certainly not worth it. In my case I have many hundreds of permutations, so it's invaluable. The idea is that for each configuration type, we have a custom test suite. These get nested together to form our permutations. We can give each suite a name according to which configuration parameter it's for, and so we can easily work out which permutation of configurations failed. To start off with, I'm going to write an abstract runner that simply has a name and a list of child runners. This will be the building block for my tree of runners.

public static class NamedParentRunner extends ParentRunner<Runner> {
  private final List<Runner> runners;
  private final String name;
  protected NamedParentRunner(Class<?> klass, List<Runner> runners, String name) throws InitializationError {
    super(klass);
    this.runners = runners;
    this.name = name;
  }
  protected List<Runner> getChildren() {
    return runners;
  }
  protected Description describeChild(Runner child) {
    return child.getDescription();
  }
  protected void runChild(Runner child, RunNotifier notifier) {
    child.run(notifier);
  }
  protected String getName() {
    return name;
  }
}

Now I’m going to write a test runner that will instantiate each test and run the methods on it.  I’ll extend the existing JUnit class runner because I don’t want to reimplement all the logic to do with looking up methods:

private static class TestRunner extends BlockJUnit4ClassRunner {
  private final Variant variant;
  private final NotificationType type;
  private final Platform platform;
  private final Language language;
  private TestRunner(Class<?> klass, Variant variant, NotificationType type,
      Platform platform, Language language) throws InitializationError {
    super(klass);
    this.variant = variant;
    this.type = type;
    this.platform = platform;
    this.language = language;
  }
  public Object createTest() throws Exception {
    return new EmailGenerationTest(variant, type, platform, language);
  }
  protected String getName() {
    return language.name();
  }
  protected String testName(final FrameworkMethod method) {
    return String.format(method.getName() + "[%s-%s-%s-%s]",
        variant.name(), type.name(), platform.name(), language.name());
  }
  protected void validateConstructor(List<Throwable> errors) {
  }
  protected Statement classBlock(RunNotifier notifier) {
    return childrenInvoker(notifier);
  }
}

Note that the name of this runner is the language; it is going to be the innermost runner, and the language is going to be the innermost level of the tree. The createTest() method is the most important one to implement here; it actually instantiates the test class with the right config. testName() is also very important: it should uniquely identify the test with its config, and it's what things like Maven will display as the name of the test. Naming it appropriately will allow you to easily see which config the test failed under.

Now I'm going to write my custom runner, which I will pass to the @RunWith annotation; it will build up a tree of nested NamedParentRunners.

public static class EmailGenerationRunner extends Suite {
  public EmailGenerationRunner(Class<?> klass) throws InitializationError {
    super(klass, createChildren(klass));
  }
  private static List<Runner> createChildren(Class<?> klass) throws InitializationError {
    List<Runner> variants = new ArrayList<Runner>();
    for (Variant variant : Variant.values()) {
      List<Runner> types = new ArrayList<Runner>();
      for (NotificationType type : NotificationType.values()) {
        List<Runner> platforms = new ArrayList<Runner>();
        for (Platform platform : Platform.values()) {
          List<Runner> languages = new ArrayList<Runner>();
          for (Language language : Language.values()) {
            languages.add(new TestRunner(klass, variant, type, platform, language));
          }
          platforms.add(new NamedParentRunner(klass, languages, platform.name()));
        }
        types.add(new NamedParentRunner(klass, platforms, type.name()));
      }
      variants.add(new NamedParentRunner(klass, types, variant.name()));
    }
    return variants;
  }
}
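
The runner then just needs to be attached to the test class with @RunWith. Something like this (a sketch, assuming the runner classes above are visible to the test class; the fields and constructor are the same as in the parameterized version):

@RunWith(EmailGenerationRunner.class)
public class EmailGenerationTest {
  private final Variant variant;
  private final NotificationType type;
  private final Platform platform;
  private final Language language;

  public EmailGenerationTest(Variant variant, NotificationType type,
      Platform platform, Language language) {
    this.variant = variant;
    this.type = type;
    this.platform = platform;
    this.language = language;
  }

  // ... the same @Test methods as before ...
}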

This is a fair bit more code than our initial attempt, and is also a lot of code for a single test class. But when you consider that this single test is running hundreds of sub-tests that test the core functionality of my application, it's not so bad. And the results are really quite nice. This is now what it looks like in IDEA; I get a tree of permutations and can click to expand them to see what passed and what failed:

[Screenshot: custom suite test run in IntelliJ IDEA]

So now we've seen some quite advanced methods for testing many permutations of configurations over the same code in JUnit. Since implementing this in the notificator, I've been able to much more confidently make major refactorings of my templates, as well as add new notification types, without having to worry about manually checking every platform, language and variant combination. I hope this will help you in the same way.

You can download the above example code from GitHub.

Unit testing Java mail code

GreenMail is a mail testing tool that has been around for a long time, but I’m surprised by how few people actually know about it.  I’ve found it very useful on many occasions for testing my mail sending code, both in integration/acceptance style tests, and also in unit tests.  In this post I’m going to give a short introduction to using GreenMail with JUnit, and then I’ll add a number of advanced tips that I’ve learned from my experience to help you effectively test your code. For my examples I’m going to be testing a class that I’ve written called MailSender.

Using GreenMail with JUnit

GreenMail starts as a server listening on ports for SMTP/POP/IMAP connections; these are handled by background threads. In order to use GreenMail in unit testing, you need to start it up before each test and shut it down again after. GreenMail is incredibly fast to start up and shut down, so there are no worries about the performance of your tests here.

private static final int SMTP_TEST_PORT = 3025;
private GreenMail greenMail;
private MailSender mailSender;

@Before
public void setUp() throws Exception {
    greenMail = new GreenMail(new ServerSetup(SMTP_TEST_PORT, null, "smtp"));
    greenMail.start();
    mailSender = new MailSender("localhost", SMTP_TEST_PORT);
}
@After
public void tearDown() throws Exception {
    greenMail.stop();
}

Now I’m ready to write my tests.  Here’s an example test:

@Test
public void sendShouldSetTheRightText() throws Exception {
    mailSender.send("Hello World!");
    assertThat((String) greenMail.getReceivedMessages()[0].getContent(),
        equalTo("Hello World!"));
}

The getReceivedMessages() method returns an array of javax.mail.internet.MimeMessage, each of which contains all the mime information from each sent email. Note that in this case, I’m assuming that everything is happening synchronously, which it is. Sometimes though (particularly in integration tests) mail is put onto a queue and sent asynchronously. GreenMail offers a handy method for dealing with this, waitForIncomingMail(), which accepts a timeout and a number of emails you want to wait for, and returns a boolean letting you know whether the mail you were waiting for arrived.
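
For example, if the mail were sent asynchronously, the test above might first block until the message arrives. A sketch (the 5 second timeout is an arbitrary choice):

@Test
public void sendShouldSetTheRightTextAsynchronously() throws Exception {
    mailSender.send("Hello World!");
    // wait up to 5 seconds for 1 email to arrive before asserting
    assertTrue(greenMail.waitForIncomingMail(5000, 1));
    assertThat((String) greenMail.getReceivedMessages()[0].getContent(),
        equalTo("Hello World!"));
}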

Selecting a good port

The above code works fine if you know that port 3025 is free. But you don't always know that, particularly if you're running your tests on a CI server, which may be running another job that's trying to use the same port. I've found the following code useful for picking a good port:

private static int findAvailablePort(int min, int max) {
    for (int port = min; port < max; port++) {
        try {
            new ServerSocket(port).close();
            return port;
        } catch (IOException e) {
            // Must already be taken
        }
    }
    throw new IllegalStateException("Could not find available port in range "
            + min + " to " + max);
}

I can now specify a range of ports and my code will choose one that is free.
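
With that in place, the setUp() method from earlier might become something like this (the 3025 to 3125 range is arbitrary):

@Before
public void setUp() throws Exception {
    int smtpPort = findAvailablePort(3025, 3125);
    greenMail = new GreenMail(new ServerSetup(smtpPort, null, "smtp"));
    greenMail.start();
    mailSender = new MailSender("localhost", smtpPort);
}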

Checking who the mail was actually sent to

As you know, mime headers can often lie. Just like the header on a letter might be different to the name on the envelope, the To header set in the mime headers might be different to the recipient actually sent to the SMTP server. This is also known as bccing, and it may be the case that your code uses this. If that is the case, you can't run assertions on the To field to ensure that the email was actually sent to the right person (although, due to a bug in GreenMail, you actually can, but then you can't know whether your code also added them to the To header). However, GreenMail provides a way to get mail that was actually delivered to a particular user. The first thing you need to do is get a reference to the object for that user; this is most easily done by setting their password, which I do in the setUp() method:

greenMailUser = greenMail.setUser("recipient@example.org", null);

Now when it comes to writing my tests, I can look up that user's inbox and find the messages that are in it:

MailFolder inbox = greenMail.getManagers().getImapHostManager().getInbox(greenMailUser);
List<StoredMessage> messages = inbox.getMessages();
if (!messages.isEmpty()) {
    messages.get(0).getMimeMessage();
    ...
} else {
    fail("No email for user arrived")
}

Checking the envelope from address

A similar issue to bccing is telling the SMTP server that you're sending from a different email address to the email address you put in the mime headers. This is also known as the envelope from or bounce address, and it's typically used for error reporting, for example, to tell the sender that the mailbox doesn't exist. This can be set in Java by creating an instance of com.sun.mail.smtp.SMTPMessage instead of MimeMessage and setting it using the setEnvelopeFrom() method. This is not a mime header that we can just check; however, SMTP servers are required to put this address into the email in the Return-Path mime header, and GreenMail does this too. So to assert that the correct envelope from address was set:

String returnPathHeader = message.getHeader("Return-Path", ",");
assertThat(returnPathHeader, notNullValue());
InternetAddress returnPath = InternetAddress.parse(returnPathHeader)[0];
assertThat(returnPath.getAddress(), equalTo("bounce-notify@example.org"));
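
For reference, the sending code that this assertion is testing might look something like the following sketch, assuming an existing javax.mail Session and that mail is sent via Transport:

SMTPMessage message = new SMTPMessage(session);
message.setFrom(new InternetAddress("noreply@example.org"));
message.setRecipient(Message.RecipientType.TO,
    new InternetAddress("recipient@example.org"));
message.setSubject("Hello");
message.setText("Hello World!");
// the envelope from (bounce address) is separate from the From header,
// and is what ends up in the Return-Path header on delivery
message.setEnvelopeFrom("bounce-notify@example.org");
Transport.send(message);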

That concludes my post on using GreenMail, if you have any other helpful tips for using GreenMail, please share them as comments here!