scala - get hdfs file path in spark


I wonder whether there is an elegant way to get the directory under a path. For example, I have a path /a/b/c/d/e/f on HDFS, and given /a/b/c, is there a straightforward way to get the path /a/b/c/d/e? I think I can do it with a regex, but I still hope to find out whether there is an easier way that makes the code cleaner. Env: Spark 1.6, language: Scala.
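For reference, one possible sketch without a regex uses Hadoop's own org.apache.hadoop.fs.Path class, which Spark already depends on; its getParent method drops the last path component. The variable names below are illustrative, not from the original post:

    import org.apache.hadoop.fs.Path

    // Full HDFS path from the example
    val full = new Path("/a/b/c/d/e/f")

    // getParent strips the last path component:
    // /a/b/c/d/e/f -> /a/b/c/d/e
    val parent = full.getParent
    println(parent)   // prints /a/b/c/d/e

Whether this fits depends on how the /a/b/c prefix is used in the surrounding code, which the question does not show.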

After several days of investigation, I think there may be no such easy way (no integrated API) to extract this. Writing a regex pattern seems to be the wise choice. Suggestions are welcome.
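A minimal sketch of the regex approach described above, using illustrative names (the exact pattern is an assumption, not taken from the post): it removes the last path segment of "/a/b/c/d/e/f" to obtain "/a/b/c/d/e".

    // Drop the trailing "/<segment>" (and any trailing slash) from the path string
    val fullPath = "/a/b/c/d/e/f"
    val parent   = fullPath.replaceAll("/[^/]+/?$", "")
    println(parent)   // prints /a/b/c/d/e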

