How to structure tests for a Hexagonal Architecture — unit-testing the domain in isolation, integration-testing adapters with Testcontainers, and keeping the test pyramid honest.
Hexagonal Architecture’s central promise is testability. If you’ve done it right, your domain logic lives inside the hexagon with no framework dependencies, no Spring annotations, no database calls. It’s plain Java. And plain Java is trivially testable without a container, without mocks of mocks, and without Testcontainers spinning up infrastructure you don’t actually need for that particular test.
But the promise only holds if you test each layer at the right boundary. The mistake most teams make is either testing too much at the integration level — every test hits the database, every test starts the full context — or testing too little at the domain level, reaching straight for the application service rather than the domain model. Both produce suites that are slow, brittle, or miss the things that actually break.
This post is about getting the layering right.
In a Hexagonal Architecture, the testing breaks down naturally into three zones:
- Domain — pure business logic, no dependencies on infrastructure or frameworks. Tested with plain JUnit, no mocks required unless the domain itself depends on other domain services.
- Application — use case orchestration, driving ports. Depends on the domain and on driving/driven port interfaces. Tested with mocked driven ports (repositories, messaging gateways), no real infrastructure.
- Adapters — the wiring between the hexagon and the outside world (HTTP, database, Kafka, external APIs). Tested with real infrastructure where possible, using Testcontainers.
The domain layer should have no Spring, no JPA, no Jackson. If your domain model has @Entity annotations on it, the hexagon is already leaking.
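One way to keep `@Entity` out of the hexagon is a separate persistence model in the adapter, with a mapper at the boundary. Below is a minimal sketch of that split; the types are simplified stand-ins for the post's domain model, and `MarketJpaEntity` and `MarketMapper` are hypothetical names (the JPA annotations on the entity are elided here to keep the sketch self-contained):

```java
import java.util.UUID;

enum MarketStatus { ACTIVE, SUSPENDED, CLOSED }

// Domain model: plain Java, no framework imports.
record BettingMarket(UUID id, MarketStatus status) {}

// Persistence model: lives in the adapter. In the real adapter this class
// carries @Entity/@Id/@Column annotations (elided here); the point is that
// those annotations never touch BettingMarket.
class MarketJpaEntity {
    UUID id;
    String status;
}

// The adapter translates between the two models at the boundary.
final class MarketMapper {
    static MarketJpaEntity toEntity(BettingMarket market) {
        var entity = new MarketJpaEntity();
        entity.id = market.id();
        entity.status = market.status().name();
        return entity;
    }

    static BettingMarket toDomain(MarketJpaEntity entity) {
        return new BettingMarket(entity.id, MarketStatus.valueOf(entity.status));
    }
}
```

The cost is a mapping layer; the payoff is that persistence concerns (column naming, lazy loading, dirty tracking) can never contaminate domain tests.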
Given a domain aggregate:
```java
public class BettingMarket {

    private final MarketId id;
    private MarketStatus status;
    private final List<Runner> runners;

    public void suspend() {
        if (status == MarketStatus.CLOSED) {
            throw new IllegalStateException("Cannot suspend a closed market");
        }
        this.status = MarketStatus.SUSPENDED;
    }

    public Optional<Runner> findRunner(RunnerId runnerId) {
        return runners.stream()
                .filter(r -> r.id().equals(runnerId))
                .findFirst();
    }
}
```
The test is pure JUnit:
```java
class BettingMarketTest {

    @Test
    void suspend_throws_when_market_is_closed() {
        var market = BettingMarketFixtures.closedMarket();

        assertThatThrownBy(market::suspend)
                .isInstanceOf(IllegalStateException.class)
                .hasMessageContaining("closed");
    }

    @Test
    void findRunner_returns_empty_for_unknown_runner() {
        var market = BettingMarketFixtures.activeMarket();

        assertThat(market.findRunner(RunnerId.of("unknown"))).isEmpty();
    }
}
```
No mocks. No annotations. No container. Runs in milliseconds. This is the foundation of the test pyramid — many of these, fast and precise.
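The tests above lean on a `BettingMarketFixtures` helper. A minimal sketch of what such a fixture class might look like, using a simplified stand-in for the domain model (real fixtures would build fully populated aggregates with runners):

```java
import java.util.UUID;

enum MarketStatus { ACTIVE, SUSPENDED, CLOSED }

// Simplified stand-in for the post's BettingMarket aggregate.
class BettingMarket {
    private final UUID id;
    private MarketStatus status;

    BettingMarket(UUID id, MarketStatus status) {
        this.id = id;
        this.status = status;
    }

    MarketStatus status() { return status; }

    void suspend() {
        if (status == MarketStatus.CLOSED) {
            throw new IllegalStateException("Cannot suspend a closed market");
        }
        this.status = MarketStatus.SUSPENDED;
    }
}

// Named constructors, one per interesting domain state. Tests read as
// "given a closed market", not as twenty lines of setup.
final class BettingMarketFixtures {
    static BettingMarket activeMarket() {
        return new BettingMarket(UUID.randomUUID(), MarketStatus.ACTIVE);
    }

    static BettingMarket closedMarket() {
        return new BettingMarket(UUID.randomUUID(), MarketStatus.CLOSED);
    }
}
```

Keeping fixtures centralised also means that when the aggregate's constructor changes, one class changes, not every test.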
The application layer orchestrates use cases through driven port interfaces — MarketRepository, EventPublisher, and so on. When testing a use case, mock the driven ports; exercise real domain logic.
```java
interface MarketRepository {
    Optional<BettingMarket> findById(MarketId id);
    void save(BettingMarket market);
}

@Service
class SuspendMarketUseCase {

    private final MarketRepository repository;
    private final MarketEventPublisher eventPublisher;

    SuspendMarketUseCase(MarketRepository repository, MarketEventPublisher eventPublisher) {
        this.repository = repository;
        this.eventPublisher = eventPublisher;
    }

    public void suspend(MarketId marketId) {
        var market = repository.findById(marketId)
                .orElseThrow(() -> new MarketNotFoundException(marketId));
        market.suspend();
        repository.save(market);
        eventPublisher.publish(new MarketSuspendedEvent(marketId));
    }
}
```
The test:
```java
class SuspendMarketUseCaseTest {

    private final MarketRepository repository = mock(MarketRepository.class);
    private final MarketEventPublisher eventPublisher = mock(MarketEventPublisher.class);
    private final SuspendMarketUseCase useCase =
            new SuspendMarketUseCase(repository, eventPublisher);

    @Test
    void publishes_event_when_market_is_suspended() {
        var market = BettingMarketFixtures.activeMarket();
        when(repository.findById(market.id())).thenReturn(Optional.of(market));

        useCase.suspend(market.id());

        verify(eventPublisher).publish(new MarketSuspendedEvent(market.id()));
    }

    @Test
    void throws_when_market_not_found() {
        when(repository.findById(any())).thenReturn(Optional.empty());

        assertThatThrownBy(() -> useCase.suspend(MarketId.of("unknown")))
                .isInstanceOf(MarketNotFoundException.class);
    }
}
```
No Spring context. No database. Mockito mocks the ports, but the domain logic — market.suspend() — runs for real. If there’s a bug in the domain rule, this test finds it.
Do not mock the domain model in use case tests. If you find yourself doing when(market.suspend()).thenReturn(...), the test is no longer exercising anything meaningful.
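Mockito is not the only option for the driven ports. A hand-rolled fake often reads better, because the test asserts on state ("what was saved?") rather than interactions. A minimal sketch, with simplified stand-in types; `InMemoryMarketRepository` and `RecordingEventPublisher` are hypothetical names:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Simplified stand-ins for the post's domain model and ports.
record BettingMarket(String id, String status) {}

interface MarketRepository {
    Optional<BettingMarket> findById(String id);
    void save(BettingMarket market);
}

interface MarketEventPublisher {
    void publish(Object event);
}

// State-based fake: the test saves through the port and asserts on
// what findById returns afterwards, not on which methods were called.
class InMemoryMarketRepository implements MarketRepository {
    private final Map<String, BettingMarket> store = new HashMap<>();

    public Optional<BettingMarket> findById(String id) {
        return Optional.ofNullable(store.get(id));
    }

    public void save(BettingMarket market) {
        store.put(market.id(), market);
    }
}

// Recording fake: collects published events for later assertions,
// playing the role Mockito's verify(...) plays above.
class RecordingEventPublisher implements MarketEventPublisher {
    final List<Object> published = new ArrayList<>();

    public void publish(Object event) {
        published.add(event);
    }
}
```

Fakes like these are reusable across every use case test that touches the same port, and a failure message says "market m1 was never saved" instead of "wanted but not invoked".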
The persistence adapter implements MarketRepository against a real database. This is where Testcontainers earns its keep.
```java
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
// @DataJpaTest loads only JPA components, so the adapter bean must be imported explicitly
@Import(MarketPersistenceAdapter.class)
@Testcontainers
class MarketPersistenceAdapterTest {

    @Container
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16");

    @DynamicPropertySource
    static void properties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgres::getJdbcUrl);
        registry.add("spring.datasource.username", postgres::getUsername);
        registry.add("spring.datasource.password", postgres::getPassword);
    }

    @Autowired
    private MarketPersistenceAdapter adapter;

    @Test
    void persists_and_retrieves_market() {
        var market = BettingMarketFixtures.activeMarket();

        adapter.save(market);
        var retrieved = adapter.findById(market.id());

        assertThat(retrieved).isPresent();
        assertThat(retrieved.get().status()).isEqualTo(MarketStatus.ACTIVE);
    }
}
```
This test starts a real PostgreSQL container, runs Flyway migrations, and exercises the full persistence path. It’s slower — but it tests the one thing only a real database can test: whether your JPA mapping, SQL, and transaction boundaries actually work.
Keep adapter tests narrow. Test the adapter against its port contract — not the use case, not the domain. The domain and use case tests already cover that.
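One way to enforce "test the adapter against its port contract" is a shared contract test: a set of behavioural checks that every `MarketRepository` implementation, the in-memory fake and the JPA adapter alike, must pass. A sketch with simplified stand-in types; in a real suite each check would be a `@Test` in an abstract base class, and `MarketRepositoryContract` is a hypothetical name:

```java
import java.util.Optional;

// Simplified stand-ins for the post's domain model and port.
record BettingMarket(String id, String status) {}

interface MarketRepository {
    Optional<BettingMarket> findById(String id);
    void save(BettingMarket market);
}

final class MarketRepositoryContract {

    // Runs every contract check against the given implementation.
    static void verify(MarketRepository repository) {
        findByIdIsEmptyForUnknownMarket(repository);
        saveThenFindRoundTrips(repository);
    }

    static void findByIdIsEmptyForUnknownMarket(MarketRepository repository) {
        if (repository.findById("no-such-market").isPresent()) {
            throw new AssertionError("expected empty Optional for unknown id");
        }
    }

    static void saveThenFindRoundTrips(MarketRepository repository) {
        var market = new BettingMarket("m-1", "ACTIVE");
        repository.save(market);
        if (!repository.findById("m-1").orElseThrow().equals(market)) {
            throw new AssertionError("expected saved market to round-trip");
        }
    }
}
```

The fake's contract run is milliseconds; the Testcontainers-backed run is seconds. Both prove the same promise, so a use case test that passed against the fake cannot be invalidated by the real adapter behaving differently.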
The web adapter — the REST controller — should be tested with @WebMvcTest, which starts only the web layer without a full application context:
```java
@WebMvcTest(MarketController.class)
class MarketControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private SuspendMarketUseCase suspendMarketUseCase;

    @Test
    void returns_200_on_suspend() throws Exception {
        mockMvc.perform(post("/markets/{id}/suspend", "1.234567"))
                .andExpect(status().isOk());

        verify(suspendMarketUseCase).suspend(MarketId.of("1.234567"));
    }

    @Test
    void returns_404_when_market_not_found() throws Exception {
        doThrow(new MarketNotFoundException(MarketId.of("unknown")))
                .when(suspendMarketUseCase).suspend(any());

        mockMvc.perform(post("/markets/{id}/suspend", "unknown"))
                .andExpect(status().isNotFound());
    }
}
```
The @MockBean replaces the real use case — this test is about the HTTP binding (request mapping, status codes, error handling), not about what happens inside the use case. Fast, focused, honest. (Note that Spring Boot 3.4 deprecates @MockBean in favour of @MockitoBean; the shape of the test is the same.)
If you follow this structure, your pyramid holds naturally:
- Domain tests: many, fast, precise
- Application (use case) tests: fewer, with mocked or faked driven ports
- Adapter tests: few, against real infrastructure via Testcontainers
- @SpringBootTest: only for smoke-testing the wiring

The failure mode to avoid is inverting this — testing everything at the @SpringBootTest level because it’s “easier”. It is easier in the short term. In the long term you get a test suite that takes ten minutes to run, gives no useful failure information when a domain rule breaks, and actively discourages the red-green-refactor loop that makes tests valuable.
If you’re working on a Spring Boot service and want to review how the test architecture maps to your domain, get in touch.