Spark errors encountered, #1: out of memory

Date: 2015-12-02

 

The original (buggy) code:

 JavaRDD<ArticleReply> javaRdd = rdd.flatMap(new FlatMapFunction<String, ArticleReply>() {
            private static final long serialVersionUID = 10000L;
            // Bug: the list is a field of the anonymous function object, so it is
            // shared across every call() invocation and keeps growing.
            List<ArticleReply> newList = new ArrayList<ArticleReply>();

            public Iterable<ArticleReply> call(String line) throws Exception {
                String[] splits = line.split("\t");
                ArticleReply bean = new ArticleReply();
                bean.setAreaId(splits[0]);
                bean.setAgent(Integer.parseInt(splits[1]));
                bean.setSerial(splits[2]);
                newList.add(bean);
                return newList;
            }
        });

 

The correct version:

 JavaRDD<ArticleReply> javaRdd = rdd.flatMap(new FlatMapFunction<String, ArticleReply>() {
            private static final long serialVersionUID = 10000L;

            public Iterable<ArticleReply> call(String line) throws Exception {
                // A fresh list per call: each invocation returns exactly one bean.
                List<ArticleReply> newList = new ArrayList<ArticleReply>();
                String[] splits = line.split("\t");
                ArticleReply bean = new ArticleReply();
                bean.setAreaId(splits[0]);
                bean.setAgent(Integer.parseInt(splits[1]));
                bean.setSerial(splits[2]);
                newList.add(bean);
                return newList;
            }
        });

In the buggy version, the list is declared and initialized outside the call() method, as a field of the anonymous function object. Each call to call() therefore adds one more bean to the same list and then returns the entire list. Spark ends up receiving 1 + 2 + 3 + ... + N objects, i.e. N(N+1)/2 in total instead of N, which drastically inflates memory consumption and causes Spark to run out of memory.
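The effect can be reproduced without Spark at all. The sketch below is a hypothetical simulation (the BuggyMapper/FixedMapper classes and totalReceived helper are not part of the original code): it plays the role of the framework by invoking the mapper once per input line and counting how many elements it receives in total.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class SharedListDemo {
    // Buggy version: the list is a field, shared across all call() invocations.
    static class BuggyMapper {
        private final List<String> newList = new ArrayList<>();
        List<String> call(String line) {
            newList.add(line);
            return newList; // returns 1 element, then 2, then 3, ...
        }
    }

    // Fixed version: a fresh list is created on every call.
    static class FixedMapper {
        List<String> call(String line) {
            List<String> newList = new ArrayList<>();
            newList.add(line);
            return newList; // always exactly 1 element
        }
    }

    // Simulates the framework: invoke the mapper once per line and
    // count the total number of elements handed back.
    static int totalReceived(Function<String, List<String>> mapper, int n) {
        int total = 0;
        for (int i = 0; i < n; i++) {
            total += mapper.apply("line" + i).size();
        }
        return total;
    }

    public static void main(String[] args) {
        int n = 100;
        int buggyTotal = totalReceived(new BuggyMapper()::call, n); // 1+2+...+100
        int fixedTotal = totalReceived(new FixedMapper()::call, n); // 100
        System.out.println(buggyTotal + " vs " + fixedTotal); // prints "5050 vs 100"
    }
}
```

With 100 lines the buggy mapper hands back 5050 elements (the triangular number 100·101/2) versus 100 for the fixed one, which is exactly the 1+2+...+N growth described above.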

Original post: http://www.cnblogs.com/fillPv/p/5013732.html
