Saturday, August 4, 2018

Use of Scala classes and objects in Spark development

I wanted to know the recommended way to structure a project when working with Scala, Spark, and IntelliJ. Is there an established design pattern for Spark projects? In my case I have created a Spark driver as a Scala object, but I need to segregate my Spark code by functionality: for example, utility code should go into a util package, and Hive-related code into a separate package. How should this be done? Should we create a package, put an object inside it, and then put the functions that perform the tasks inside that object? I am not sure what the project structure should look like.
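For concreteness, the layout being described might look roughly like the sketch below. This is only a minimal illustration, not an established Spark convention; every package, object, and table name here (com.example.myapp, DataFrameUtils, HiveReader, db.my_table) is hypothetical.

// File: src/main/scala/com/example/myapp/util/DataFrameUtils.scala
package com.example.myapp.util

import org.apache.spark.sql.DataFrame

// Stateless utility helpers live in an object, so callers can use
// them directly without instantiating anything.
object DataFrameUtils {
  // Trim whitespace from every column name.
  def trimColumnNames(df: DataFrame): DataFrame =
    df.columns.foldLeft(df)((acc, c) => acc.withColumnRenamed(c, c.trim))
}

// File: src/main/scala/com/example/myapp/hive/HiveReader.scala
package com.example.myapp.hive

import org.apache.spark.sql.{DataFrame, SparkSession}

// Hive access is isolated in its own package.
object HiveReader {
  def loadTable(spark: SparkSession, table: String): DataFrame =
    spark.table(table)
}

// File: src/main/scala/com/example/myapp/Driver.scala
package com.example.myapp

import org.apache.spark.sql.SparkSession
import com.example.myapp.util.DataFrameUtils
import com.example.myapp.hive.HiveReader

// The driver stays thin: it builds the session and wires together
// the helpers from the util and hive packages.
object Driver {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("my-spark-app")
      .enableHiveSupport()
      .getOrCreate()

    val raw     = HiveReader.loadTable(spark, "db.my_table")
    val cleaned = DataFrameUtils.trimColumnNames(raw)
    cleaned.show()

    spark.stop()
  }
}

Keeping pure, stateless functions in objects (rather than classes) mirrors how the driver object already works, and it keeps each concern separately testable.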
