Spark components were delivered as a Technical Preview in v5.6, but they couldn't leverage the full Studio big data capabilities, such as turning a standard Job into an M/R Job and then into a Streaming Job.
The effortless migration of M/R Jobs to Spark was one of the key requirements on our 6.0 (subscription) roadmap.
However, developing this capability required an overall change in the way we handled Spark, so that it became a native part of the code generator.
Maintaining two different sets of components generating Spark code (given all the links to Hadoop distributions and versions) was not feasible, so we had to drop the initial tSpark** components.
Please find more details on our Talend 6 "What's New" page: http://www.talend.com/products/talend-6