DataStage job with lookup aborts with error File too large to fit

This is another error I encountered today, and it looked odd to me because I was facing it for the first time.

A DataStage job that contains a Lookup stage fails with an error similar to the following: Lookup_Tcodes,0: Error writing table file “/dsData/Datasets/lookuptable.20100217.rtesd”: File too large

To resolve this:

  1. Hash-partition the lookup data on the lookup key(s).
  2. Change the parallel configuration file to two or more nodes so the lookup table is broken into smaller per-node files.
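To see why step 1 calls specifically for hash partitioning, here is a small Python sketch (not DataStage code – the node count and sample records are made up) showing that a hash of the key deterministically maps every record with the same key to the same partition, so each node's smaller lookup file still holds complete data for its keys:

```python
# Illustration only: hash partitioning keeps all records for a given key
# on one node, so splitting the lookup table does not lose matches.
NUM_NODES = 2  # hypothetical node count, mirroring a two-node config file

def hash_partition(key: str, num_nodes: int = NUM_NODES) -> int:
    """Deterministically map a lookup key to a partition number."""
    return hash(key) % num_nodes

# Made-up lookup records: (key, value)
records = [("A", 1), ("B", 2), ("A", 3), ("C", 4), ("B", 5)]

partitions: dict[int, list[tuple[str, int]]] = {}
for key, value in records:
    partitions.setdefault(hash_partition(key), []).append((key, value))

# Every record sharing a key lands in the same partition, and each
# per-node table file is smaller than the original single file.
for node, rows in sorted(partitions.items()):
    print(node, rows)
```

Round-robin or random partitioning, by contrast, would spread rows for the same key across nodes, which is why the note below restricts the fix to hash partitioning.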

  • Use hash partitioning only – since we are breaking the data into smaller pieces, any other partitioning method can scatter rows for the same key across nodes and give erroneous lookup results

  • If you are unsure about the impact on other jobs, change the configuration file at the job level rather than project-wide
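A job-level two-node configuration file can look like the sketch below; the node names, fastname, and resource paths are placeholders you would replace with your environment's values, and you would point the job's $APT_CONFIG_FILE parameter at this file so only that job is affected:

```
{
  node "node1"
  {
    fastname "etl_server"
    pools ""
    resource disk "/dsData/node1/disk" {pools ""}
    resource scratchdisk "/dsData/node1/scratch" {pools ""}
  }
  node "node2"
  {
    fastname "etl_server"
    pools ""
    resource disk "/dsData/node2/disk" {pools ""}
    resource scratchdisk "/dsData/node2/scratch" {pools ""}
  }
}
```

With two nodes, the lookup table file is split into two smaller per-node files, which is what works around the "File too large" limit.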

Please feel free to leave comments or suggestions :)

 

Author: Kuntamukkala Ravi

ETL Consultant by Profession, Webmaster by Passion
