Jackson Crypto Module fails to decrypt data during deserialization in Temporal workflows

I’m working with Temporal workflows for state management and need to store encrypted data instead of plain text, so I decided to use the Jackson JSON Crypto module (jackson-json-crypto) for this purpose.

The encryption part works perfectly and my data gets stored in encrypted format. However, I’m facing issues during deserialization where the encrypted data remains encrypted instead of being converted back to the original values.

I think the @Encrypt annotation on my field is not being processed during deserialization. Debugging with decompiled library code, I can see the deserializer module’s constructors run when the module is registered, but the overridden deserialize method is never invoked. The encrypted JSON is left as-is and mapped directly onto the field.

Can someone explain how these annotations work during serialization and deserialization? Does the @Encrypt annotation get processed before or after the object mapper maps data to the POJO?

Here’s my setup:

// Setting up crypto module with object mapper
CryptoService cryptoService = new CryptoService(objectMapper, new PasswordCryptoContext("SecretKey123"));
objectMapper.registerModule(new CryptoModule().addEncryptionService(cryptoService));

// Field with encryption annotation
@JsonProperty("user_id")
@Encrypt
private String userId;

When data gets encrypted:

"user_id": {
  "salt": "Mz91XhtANF85Dp7wRb9EK8tQfJc=",
  "iv": "Y72bkmfqvHk3SGDTJ+zy/g==",
  "value": "M7VEVKkFp1j8KG1cH3IFjHdOmNKdIxoLIc1+gjNqnEr3AYRrGgwKaaS7F9f6n8DI"
}

Expected after decryption: "user_id": "xyz123"

Actual result: Same encrypted object structure
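To isolate the mapper from Temporal, this is the round trip I’d expect to work (the `User` POJO and its `setUserId` setter are just a hypothetical wrapper around the annotated field above):

```java
// Hypothetical standalone round-trip, reusing the setup above.
// User is a simple POJO holding the annotated userId field.
ObjectMapper mapper = new ObjectMapper();
CryptoService cryptoService =
    new CryptoService(mapper, new PasswordCryptoContext("SecretKey123"));
mapper.registerModule(new CryptoModule().addEncryptionService(cryptoService));

User original = new User();
original.setUserId("xyz123");

String json = mapper.writeValueAsString(original);  // user_id serialized as {salt, iv, value}
User restored = mapper.readValue(json, User.class); // expecting userId back as "xyz123"
```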

I encountered a similar issue with the Jackson Crypto module when integrating it into Temporal workflows. The core problem is that Temporal’s default data converter serializes payloads with its own mapper and never sees the custom annotations needed to process encrypted data. To resolve this, implement a custom JsonDataConverter that uses your ObjectMapper with the registered CryptoModule, so both serialization and deserialization go through your configured mapper.

To do this, create an instance of your custom converter and set it within the Temporal client like so:

JsonDataConverter customConverter = new JsonDataConverter(objectMapper);
DefaultDataConverter dataConverter = DefaultDataConverter.newDefaultInstance()
    .withPayloadConverterOverrides(customConverter);

Make sure to apply this converter in your worker and client configuration. This approach guarantees that your annotations, including the @Encrypt annotation, are effectively processed during serialization and deserialization.
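As a rough sketch of that wiring (the local service stub and the task queue name are placeholders for your own setup), both sides end up sharing the one converter:

```java
// Sketch: share one data converter between client and worker.
// newLocalServiceStubs() and "my-task-queue" are placeholders for your setup.
WorkflowServiceStubs service = WorkflowServiceStubs.newLocalServiceStubs();
WorkflowClient client = WorkflowClient.newInstance(service,
    WorkflowClientOptions.newBuilder()
        .setDataConverter(dataConverter)
        .build());

// Workers created from this client's factory inherit its data converter.
WorkerFactory factory = WorkerFactory.newInstance(client);
Worker worker = factory.newWorker("my-task-queue");
```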

This looks like a classic case where Temporal’s default data converter isn’t picking up your custom Jackson configuration. I ran into this exact same thing about 6 months ago on a payment processing workflow.

Temporal uses its own ObjectMapper instance internally and completely ignores your configured one with the crypto module. Your @Encrypt annotations work fine during regular app serialization but get bypassed when Temporal handles workflow data.

Here’s what fixed it for me:

// Create your crypto-enabled ObjectMapper
ObjectMapper cryptoMapper = new ObjectMapper();
CryptoService cryptoService = new CryptoService(cryptoMapper, new PasswordCryptoContext("SecretKey123"));
cryptoMapper.registerModule(new CryptoModule().addEncryptionService(cryptoService));

// Build custom data converter
JacksonJsonPayloadConverter jsonConverter = new JacksonJsonPayloadConverter(cryptoMapper);
DefaultDataConverter dataConverter = DefaultDataConverter.newDefaultInstance()
    .withPayloadConverterOverrides(jsonConverter);

// Apply to both client and worker
WorkflowClient client = WorkflowClient.newInstance(service, 
    WorkflowClientOptions.newBuilder()
        .setDataConverter(dataConverter)
        .build());

The key difference from the other answer is using JacksonJsonPayloadConverter (the SDK’s Jackson-based PayloadConverter) instead of JsonDataConverter - that’s what actually hooks into Temporal’s serialization pipeline.

Make sure you apply the same data converter to both your client and worker configs; otherwise you’ll get deserialization errors when the workflow replays from history.
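For completeness, here’s roughly how the worker side picks the converter up (the task queue name and workflow implementation class are placeholders):

```java
// Sketch: the worker inherits the data converter from the client it was built from,
// so replayed history payloads are decoded with the same crypto-enabled mapper.
WorkerFactory factory = WorkerFactory.newInstance(client);
Worker worker = factory.newWorker("payments-task-queue");         // placeholder queue name
worker.registerWorkflowImplementationTypes(MyWorkflowImpl.class); // placeholder workflow
factory.start();
```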