Asked by: 小点点

ClassCastException in Java with the HierarchyStruc class, as shown below


I am getting a ClassCastException in Java and cannot work out why. Can someone tell me what I am missing?

Here is my code:

private List<Record> getGenericRecordsforHierarchy(int hID, Schema schema, List<HierarchyStruc> hs, int maxLevel) throws BookHierarchyException {
    List<GenericData.Record> recordList = new ArrayList<GenericData.Record>();
    GenericData.Record record = new GenericData.Record(schema);
    for (int i = 0; i < hs.size(); i++) {
        record = new GenericData.Record(schema);
        record.put("HierarchyId", hID);
        for (int j = 0; j <= (maxLevel * 3) && j < ((Constants.MAX_ALLOWED) * 3); j++) {
            int k = 0;
            record.put("Level" + (k + 1) + "Id", hs.get(j));
            record.put("Level" + (k + 1) + "Desc", hs.get(j + 1));
            record.put("Level" + (k + 1) + "nodeId", hs.get(j + 2));
            j = j + 2;
            k++;
            if (j + 1 > (maxLevel * 3) || null == hs.get(j + 1)) {
                record.put("parentNodeId", hs.get(i).getParentNodeId());
                record.put("BookName", hs.get(i).getBookName());
                record.put("HierarchyName", hs.get(i).getHierarchyName());
                record.put("NodeDesc", hs.get(i).getNodeDesc());
                break;
            }
        }
        recordList.add(record);
    }
    return recordList;
}

My GenericData.Record class looks like this:

public static class Record implements GenericRecord, Comparable<Record> {
    private final Schema schema;
    private final Object[] values;
    public Record(Schema schema) {
      if (schema == null || !Type.RECORD.equals(schema.getType()))
        throw new AvroRuntimeException("Not a record schema: "+schema);
      this.schema = schema;
      this.values = new Object[schema.getFields().size()];
    }
    public Record(Record other, boolean deepCopy) {
      schema = other.schema;
      values = new Object[schema.getFields().size()];
      if (deepCopy) {
        for (int ii = 0; ii < values.length; ii++) {
          values[ii] = INSTANCE.deepCopy(
              schema.getFields().get(ii).schema(), other.values[ii]);
        }
      }
      else {
        System.arraycopy(other.values, 0, values, 0, other.values.length);
      }
    }
    @Override public Schema getSchema() { return schema; }
    @Override public void put(String key, Object value) {
      Schema.Field field = schema.getField(key);
      if (field == null)
        throw new AvroRuntimeException("Not a valid schema field: "+key);

      values[field.pos()] = value;
    }
    @Override public void put(int i, Object v) { values[i] = v; }
    @Override public Object get(String key) {
      Field field = schema.getField(key);
      if (field == null) return null;
      return values[field.pos()];
    }
    @Override public Object get(int i) { return values[i]; }
    @Override public boolean equals(Object o) {
      if (o == this) return true;                 // identical object
      if (!(o instanceof Record)) return false;   // not a record
      Record that = (Record)o;
      if (!this.schema.equals(that.schema))
        return false;                             // not the same schema
      return GenericData.get().compare(this, that, schema, true) == 0;
    }
    @Override public int hashCode() {
      return GenericData.get().hashCode(this, schema);
    }
    @Override public int compareTo(Record that) {
      return GenericData.get().compare(this, that, schema);
    }
    @Override public String toString() {
      return GenericData.get().toString(this);
    }
  }

I get the error in the following method:

private void writeToParquet(String hadoopPath, List<Record> recordList, Schema schema)
        throws BookHierarchyException {
    org.apache.hadoop.fs.Path path = new org.apache.hadoop.fs.Path(hadoopPath);
    ParquetWriter<GenericData.Record> writer = null;

    Configuration configuration = new Configuration(false);
    configuration.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());

    try {
        writer = AvroParquetWriter.<GenericData.Record>builder(path)
                .withRowGroupSize(ParquetWriter.DEFAULT_BLOCK_SIZE)
                .withPageSize(ParquetWriter.DEFAULT_PAGE_SIZE)
                .withSchema(schema)
                .withConf(new Configuration())
                .withCompressionCodec(CompressionCodecName.SNAPPY)
                .withValidation(false)
                .withDictionaryEncoding(false)
                .build();

        for (GenericData.Record record : recordList) {
            writer.write(record);
        }
        log.info("File writing done. Closing file");
        writer.close();
    } catch (IOException e) {
        log.error(e);
        throw new BookHierarchyException("Error in file handling");
    }
}

The error is:

java.lang.ClassCastException: HierarchyStruc cannot be cast to java.lang.CharSequence

Can someone tell me what I am doing wrong here? I have compared the schema types with the HierarchyStruc column types, and they match. This seems to be a different problem, and I have been trying to solve it since this morning.


1 Answer

Anonymous user

It is hard to tell from your code, because it is incomplete. You provide a method getGenericRecordsforHierarchy, but never show where it is called. You also have not provided the stack trace for the ClassCastException, so we do not even know where it occurs.

That said, this part looks suspicious:

record.put("Level" + (k+1) + "Id", hs.get(j));
record.put("Level" + (k+1) + "Desc", hs.get(j+1) );
record.put("Level" + (k+1) + "nodeId", hs.get(j+2) );

Here you are putting entire HierarchyStruc instances into the GenericData.Record, but the field names suggest those values should be strings. When you pass the record to something that expects those fields to hold strings, such as the Parquet writer, it throws exactly the exception you are seeing.
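A minimal sketch of the fix, assuming HierarchyStruc exposes getters such as getLevelId(), getLevelDesc() and getNodeId() (hypothetical names; substitute whatever accessors the class actually declares), would copy the string values out of the struct instead of storing the struct itself:

// Sketch only: the getter names below are assumptions, not a known API.
// The point is that a value stored in a string-typed Avro field must be
// a String/CharSequence, not a HierarchyStruc instance.
private void putLevel(GenericData.Record record, HierarchyStruc struc, int level) {
    record.put("Level" + level + "Id", struc.getLevelId());     // String value
    record.put("Level" + level + "Desc", struc.getLevelDesc()); // String value
    record.put("Level" + level + "nodeId", struc.getNodeId());  // String value
}

Note that GenericData.Record.put(String, Object) accepts any Object, so the mistake compiles cleanly; it only fails later, when the writer walks the schema and casts each string field's value to CharSequence.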

If you can provide the stack trace, and the source of the classes that appear in it (where available), I can give you a better answer.