Status Codes

IoTDB requires a time series to be registered before data can be written to it. Without status codes, a sample solution looks like this:

```java
try {
    writeData();
} catch (SQLException e) {
    // In most cases, the cause is that the time series does not exist
    if (e.getMessage().contains("exist")) {
        // However, inspecting the content of the error message is unreliable and inefficient
        registerTimeSeries();
        // Write the data once again
        writeData();
    }
}
```

With status codes, instead of writing code like `if (e.getErrorMessage().contains("exist"))`, we can simply check `e.getErrorCode() == TSStatusCode.TIME_SERIES_NOT_EXIST_ERROR.getStatusCode()`.
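The check above can be sketched as a small helper. This is a minimal, self-contained illustration: `StatusCodeRetry`, `shouldRegisterFirst`, and the local constant `TIME_SERIES_NOT_EXIST_CODE` are assumptions made for the sketch; a real client would obtain the numeric value from the `TSStatusCode` enum instead of hard-coding it.

```java
import java.sql.SQLException;

public class StatusCodeRetry {
    // Hypothetical stand-in for TSStatusCode.TIME_SERIES_NOT_EXIST_ERROR.getStatusCode();
    // in a real client, take the value from the TSStatusCode enum rather than a constant.
    static final int TIME_SERIES_NOT_EXIST_CODE = 508;

    // Decide whether the series must be registered first by comparing numeric codes,
    // instead of searching the error message text.
    static boolean shouldRegisterFirst(SQLException e) {
        return e.getErrorCode() == TIME_SERIES_NOT_EXIST_CODE;
    }
}
```

Compared with string matching, the numeric comparison is exact and keeps working even if the wording of the error message changes between versions.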

Here is a list of status codes and the related messages:

| Status Code | Status Type | Meaning |
|---|---|---|
| 200 | SUCCESS_STATUS | |
| 201 | INCOMPATIBLE_VERSION | Incompatible version |
| 202 | CONFIGURATION_ERROR | Configuration error |
| 203 | START_UP_ERROR | Meet error while starting |
| 204 | SHUT_DOWN_ERROR | Meet error while shutdown |
| 300 | UNSUPPORTED_OPERATION | Unsupported operation |
| 301 | EXECUTE_STATEMENT_ERROR | Execute statement error |
| 302 | MULTIPLE_ERROR | Meet error when executing multiple statements |
| 303 | ILLEGAL_PARAMETER | Parameter is illegal |
| 304 | OVERLAP_WITH_EXISTING_TASK | Current task has some conflict with existing tasks |
| 305 | INTERNAL_SERVER_ERROR | Internal server error |
| 306 | DISPATCH_ERROR | Meet error while dispatching |
| 400 | REDIRECTION_RECOMMEND | Recommend client redirection |
| 500 | DATABASE_NOT_EXIST | Database does not exist |
| 501 | DATABASE_ALREADY_EXISTS | Database already exists |
| 502 | SERIES_OVERFLOW | Series number exceeds the threshold |
| 503 | TIMESERIES_ALREADY_EXIST | Timeseries already exists |
| 504 | TIMESERIES_IN_BLACK_LIST | Timeseries is being deleted |
| 505 | ALIAS_ALREADY_EXIST | Alias already exists |
| 506 | PATH_ALREADY_EXIST | Path already exists |
| 507 | METADATA_ERROR | Meet error when dealing with metadata |
| 508 | PATH_NOT_EXIST | Path does not exist |
| 509 | ILLEGAL_PATH | Illegal path |
| 510 | CREATE_TEMPLATE_ERROR | Create schema template error |
| 511 | DUPLICATED_TEMPLATE | Schema template is duplicated |
| 512 | UNDEFINED_TEMPLATE | Schema template is not defined |
| 513 | TEMPLATE_NOT_SET | Schema template is not set |
| 514 | DIFFERENT_TEMPLATE | Template is not consistent |
| 515 | TEMPLATE_IS_IN_USE | Template is in use |
| 516 | TEMPLATE_INCOMPATIBLE | Template is not compatible |
| 517 | SEGMENT_NOT_FOUND | Segment not found |
| 518 | PAGE_OUT_OF_SPACE | Not enough space on schema page |
| 519 | RECORD_DUPLICATED | Record is duplicated |
| 520 | SEGMENT_OUT_OF_SPACE | Not enough space on schema segment |
| 521 | PBTREE_FILE_NOT_EXISTS | PBTreeFile does not exist |
| 522 | OVERSIZE_RECORD | Size of record exceeds the threshold of a PBTreeFile page |
| 523 | PBTREE_FILE_REDO_LOG_BROKEN | PBTreeFile redo log is broken |
| 524 | TEMPLATE_NOT_ACTIVATED | Schema template is not activated |
| 526 | SCHEMA_QUOTA_EXCEEDED | Schema usage exceeds quota limit |
| 527 | MEASUREMENT_ALREADY_EXISTS_IN_TEMPLATE | Measurement already exists in schema template |
| 600 | SYSTEM_READ_ONLY | IoTDB system is read only |
| 601 | STORAGE_ENGINE_ERROR | Storage engine related error |
| 602 | STORAGE_ENGINE_NOT_READY | The storage engine is in recovery and not yet ready to accept read/write operations |
| 603 | DATAREGION_PROCESS_ERROR | DataRegion related error |
| 604 | TSFILE_PROCESSOR_ERROR | TsFile processor related error |
| 605 | WRITE_PROCESS_ERROR | Writing data related error |
| 606 | WRITE_PROCESS_REJECT | Writing data rejected error |
| 607 | OUT_OF_TTL | Insertion time is earlier than the TTL time bound |
| 608 | COMPACTION_ERROR | Meet error while merging |
| 609 | ALIGNED_TIMESERIES_ERROR | Meet error in aligned timeseries |
| 610 | WAL_ERROR | WAL error |
| 611 | DISK_SPACE_INSUFFICIENT | Disk space is insufficient |
| 700 | SQL_PARSE_ERROR | Meet error while parsing SQL |
| 701 | SEMANTIC_ERROR | SQL semantic error |
| 702 | GENERATE_TIME_ZONE_ERROR | Meet error while generating time zone |
| 703 | SET_TIME_ZONE_ERROR | Meet error while setting time zone |
| 704 | QUERY_NOT_ALLOWED | Query statements are not allowed error |
| 705 | LOGICAL_OPERATOR_ERROR | Logical operator related error |
| 706 | LOGICAL_OPTIMIZE_ERROR | Logical optimize related error |
| 707 | UNSUPPORTED_FILL_TYPE | Unsupported fill type related error |
| 708 | QUERY_PROCESS_ERROR | Query process related error |
| 709 | MPP_MEMORY_NOT_ENOUGH | Not enough memory for task execution in MPP |
| 710 | CLOSE_OPERATION_ERROR | Meet error in close operation |
| 711 | TSBLOCK_SERIALIZE_ERROR | TsBlock serialization error |
| 712 | INTERNAL_REQUEST_TIME_OUT | MPP operation timeout |
| 713 | INTERNAL_REQUEST_RETRY_ERROR | Internal operation retry failed |
| 714 | NO_SUCH_QUERY | Cannot find target query |
| 715 | QUERY_WAS_KILLED | Query was killed during execution |
| 800 | UNINITIALIZED_AUTH_ERROR | Failed to initialize auth module |
| 801 | WRONG_LOGIN_PASSWORD | Username or password is wrong |
| 802 | NOT_LOGIN | Not logged in |
| 803 | NO_PERMISSION | No permission to operate |
| 804 | USER_NOT_EXIST | User does not exist |
| 805 | USER_ALREADY_EXIST | User already exists |
| 806 | USER_ALREADY_HAS_ROLE | User already has target role |
| 807 | USER_NOT_HAS_ROLE | User does not have target role |
| 808 | ROLE_NOT_EXIST | Role does not exist |
| 809 | ROLE_ALREADY_EXIST | Role already exists |
| 810 | ALREADY_HAS_PRIVILEGE | Already has privilege |
| 811 | NOT_HAS_PRIVILEGE | Does not have privilege |
| 812 | CLEAR_PERMISSION_CACHE_ERROR | Failed to clear permission cache |
| 813 | UNKNOWN_AUTH_PRIVILEGE | Unknown auth privilege |
| 814 | UNSUPPORTED_AUTH_OPERATION | Unsupported auth operation |
| 815 | AUTH_IO_EXCEPTION | IO exception in auth module |
| 900 | MIGRATE_REGION_ERROR | Error when migrating region |
| 901 | CREATE_REGION_ERROR | Create region error |
| 902 | DELETE_REGION_ERROR | Delete region error |
| 903 | PARTITION_CACHE_UPDATE_ERROR | Update partition cache failed |
| 904 | CONSENSUS_NOT_INITIALIZED | Consensus is not initialized and cannot provide service |
| 905 | REGION_LEADER_CHANGE_ERROR | Region leader migration failed |
| 906 | NO_AVAILABLE_REGION_GROUP | Cannot find an available region group |
| 907 | LACK_DATA_PARTITION_ALLOCATION | Lacked some data partition allocation result in the response |
| 1000 | DATANODE_ALREADY_REGISTERED | DataNode already registered in cluster |
| 1001 | NO_ENOUGH_DATANODE | The number of DataNodes is not enough; cannot remove DataNode or create enough replicas |
| 1002 | ADD_CONFIGNODE_ERROR | Add ConfigNode error |
| 1003 | REMOVE_CONFIGNODE_ERROR | Remove ConfigNode error |
| 1004 | DATANODE_NOT_EXIST | DataNode not exist error |
| 1005 | DATANODE_STOP_ERROR | DataNode stop error |
| 1006 | REMOVE_DATANODE_ERROR | Remove DataNode failed |
| 1007 | REGISTER_DATANODE_WITH_WRONG_ID | The DataNode to be registered has an incorrect register id |
| 1008 | CAN_NOT_CONNECT_DATANODE | Cannot connect to DataNode |
| 1100 | LOAD_FILE_ERROR | Meet error while loading file |
| 1101 | LOAD_PIECE_OF_TSFILE_ERROR | Error when loading a piece of TsFile |
| 1102 | DESERIALIZE_PIECE_OF_TSFILE_ERROR | Error when deserializing a piece of TsFile |
| 1103 | SYNC_CONNECTION_ERROR | Sync connection error |
| 1104 | SYNC_FILE_REDIRECTION_ERROR | Sync TsFile redirection error |
| 1105 | SYNC_FILE_ERROR | Sync TsFile error |
| 1106 | CREATE_PIPE_SINK_ERROR | Failed to create a PIPE sink |
| 1107 | PIPE_ERROR | PIPE error |
| 1108 | PIPESERVER_ERROR | PIPE server error |
| 1109 | VERIFY_METADATA_ERROR | Meet error when validating timeseries schema |
| 1200 | UDF_LOAD_CLASS_ERROR | Error when loading UDF class |
| 1201 | UDF_DOWNLOAD_ERROR | DataNode cannot download UDF from ConfigNode |
| 1202 | CREATE_UDF_ON_DATANODE_ERROR | Error when creating UDF on DataNode |
| 1203 | DROP_UDF_ON_DATANODE_ERROR | Error when dropping a UDF on DataNode |
| 1300 | CREATE_TRIGGER_ERROR | ConfigNode create trigger error |
| 1301 | DROP_TRIGGER_ERROR | ConfigNode delete trigger error |
| 1302 | TRIGGER_FIRE_ERROR | Error when firing trigger |
| 1303 | TRIGGER_LOAD_CLASS_ERROR | Error when loading trigger class |
| 1304 | TRIGGER_DOWNLOAD_ERROR | Error when downloading trigger from ConfigNode |
| 1305 | CREATE_TRIGGER_INSTANCE_ERROR | Error when creating trigger instance |
| 1306 | ACTIVE_TRIGGER_INSTANCE_ERROR | Error when activating trigger instance |
| 1307 | DROP_TRIGGER_INSTANCE_ERROR | Error when dropping trigger instance |
| 1308 | UPDATE_TRIGGER_LOCATION_ERROR | Error when moving stateful trigger to new DataNode |
| 1400 | NO_SUCH_CQ | CQ task does not exist |
| 1401 | CQ_ALREADY_ACTIVE | CQ is already active |
| 1402 | CQ_ALREADY_EXIST | CQ already exists |
| 1403 | CQ_UPDATE_LAST_EXEC_TIME_ERROR | CQ update last execution time failed |

In the latest version, all exceptions are refactored by extracting uniform messages into exception classes, and distinct error codes are added to all exceptions. When an exception is caught and a higher-level exception is thrown, the error code is kept and passed along, so that users know the detailed cause of the error.
A base exception class, ProcessException, is also added and is extended by all other exceptions.
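The code-preserving behavior described above can be sketched as follows. This is an illustrative, self-contained sketch, not the actual IoTDB class: the constructor shapes and field names are assumptions; only the idea (a wrapping exception copies the error code of its cause) comes from the text.

```java
// Illustrative base exception carrying an error code that survives re-wrapping.
// The real IoTDB ProcessException may differ in signature and fields.
class ProcessException extends Exception {
    private final int errorCode;

    ProcessException(String message, int errorCode) {
        super(message);
        this.errorCode = errorCode;
    }

    // When a higher-level exception wraps a lower-level one,
    // keep the original error code so the detailed cause is not lost.
    ProcessException(String message, ProcessException cause) {
        super(message, cause);
        this.errorCode = cause.getErrorCode();
    }

    int getErrorCode() {
        return errorCode;
    }
}
```

A caller several layers up can then still branch on the original numeric code, even though the message at the top level describes only the high-level failure.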
