When building a Custom Domain Table (CDT) in Amperity, customers may encounter spec failures or unhandled service errors while starting a Spark session or testing SQL queries. These errors often occur due to unsupported column types or improper query syntax.
Problem
When creating a Custom Domain Table (CDT) or starting a Spark session, you may encounter an “Unhandled service error: Spec failure”. This typically occurs during schema validation and prevents the Spark session from starting or the CDT from being created.
Unhandled service error: Spec failure: clojure.spec.alpha$map_spec_impl$reify__1984@6cc1f857 [] - failed: (<= 1 (count %) MAX_VALUE) in: [:amperity.query.mount/tables] at: [:amperity.query.mount/tables] spec: :amperity.query.mount/tables
Additionally, an on-screen error message may appear:
Error at line 0- Complex type: Array not supported on columns: items
These errors prevent the Spark session from starting and block use of CDT features like Identify PK.
Cause
This error most commonly happens when the CDT query uses SELECT * and includes unsupported or complex data types, such as arrays or nested structures. Custom Domain Tables require explicitly defined, supported column types, and complex fields cannot be validated automatically.
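As an illustration, a query like the following can trigger the failure when the source table contains a complex column (here, a hypothetical `items` array column, matching the column named in the on-screen error; the table name is an assumption):

```sql
-- Hypothetical failing CDT query: SELECT * pulls in every column,
-- including the unsupported array column "items".
SELECT *
FROM Unified_Transactions;
```

Because the query does not enumerate its columns, schema validation encounters the array type and the Spark session fails to start.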
Resolution
Avoid using SELECT * when defining a CDT. Explicitly select only the required columns.
Exclude columns that contain complex data types (e.g., arrays or nested objects).
If needed, transform or flatten complex fields before including them in the CDT.
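The steps above can be sketched as a corrected query. This is a minimal example, assuming a table `Unified_Transactions` with a complex `items` column (the column name comes from the error message above; the table and other column names are assumptions). It selects only supported scalar columns and, where the array data is still needed, flattens it to a string using Spark SQL's `to_json` function:

```sql
-- Explicitly select supported scalar columns instead of SELECT *,
-- and flatten the complex "items" column to a JSON string so the
-- resulting schema contains only supported types.
SELECT
  order_id,
  customer_id,
  order_date,
  to_json(items) AS items_json
FROM Unified_Transactions;
```

If the array contents are not needed in the CDT at all, simply omitting `items` from the column list is the simplest fix.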
Best Practices
Always define CDT schemas explicitly.
Review source data types before creating a CDT.
Use Spark sessions with queries that only include supported column types.
Following these steps should allow the Spark session to start successfully and the CDT to be created without errors.