

Pentaho Data Integration (PDI). The actual Hive JDBC implementation for a specific distribution and version of Hadoop is located in the Pentaho Configuration (shim) for that distribution. The pentaho-hadoop-hive-jdbc-shim-xxx.jar library is a proxy driver, shipped with these Pentaho products: Pentaho Server, Pentaho Data Integration, Pentaho Metadata Editor, and Pentaho Report Designer.

Pentaho has certified its business analytics and data integration platform to work with Amazon Redshift. You will work with Amazon Redshift and the Pentaho Data Integration platform (Kettle), gaining opportunities to learn some of the most innovative … Customers can now take advantage of both Redshift's automation of labor-intensive tasks, such as setting up, operating, and creating a data warehouse cluster, and the power of Pentaho's big data analytics platform to cost-effectively improve …

I have a Pentaho Transformation whereby I cannot serialize my fields using Avro, ready for Kafka Producer consumption. The first step is: 1) read messages from Kafka using the "Kafka Consumer" plugin. For context, the problem step is shown below.
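The Avro/Kafka question above turns on how Avro lays out a record's bytes before they are handed to a Kafka Producer. The sketch below is not the PDI step itself; it is a hand-rolled illustration of Avro's binary encoding, assuming a hypothetical schema with a long `id` field and a string `name` field (all helper names here are made up for the example):

```python
def zigzag(n: int) -> int:
    # Avro zigzag-encodes signed longs so small magnitudes use few bytes
    return (n << 1) ^ (n >> 63)

def varint(n: int) -> bytes:
    # Little-endian base-128 varint; high bit of each byte = "more follows"
    out = bytearray()
    while n > 0x7F:
        out.append((n & 0x7F) | 0x80)
        n >>= 7
    out.append(n)
    return bytes(out)

def encode_long(n: int) -> bytes:
    return varint(zigzag(n))

def encode_string(s: str) -> bytes:
    # Avro strings are a long byte-length followed by the UTF-8 bytes
    data = s.encode("utf-8")
    return encode_long(len(data)) + data

def encode_record(rec_id: int, name: str) -> bytes:
    # An Avro record body is simply its fields encoded in schema order,
    # with no field names or tags in the payload (hypothetical schema:
    # {"id": long, "name": string})
    return encode_long(rec_id) + encode_string(name)

payload = encode_record(1, "abc")
print(payload)  # b'\x02\x06abc'
```

In practice you would not encode by hand: a real Kafka message produced this way usually also carries a schema (embedded, or referenced via a schema registry), and PDI's Avro steps or the Avro client libraries handle that layer for you.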
...

After I installed Pentaho CE 6.1, the Marketplace is empty. I read several notes on installing Marketplace 5.4 in 6.1 to make Marketplace work. I installed it and now have a working Marketplace, but I have two Marketplace items on the menu and one of them still shows up blank. I am pretty new to Pentaho, so any help would be appreciated.
