Event Debug Server Status

Version: 5.14.3 (#4 built by jenkins on 20180414-0409 git: 7f2725eea4edeb1d184a2888db790d332167b6f8)
Time now: October 14, 2024 6:24:32 AM CST



EventCatcherService (com.cloudera.cmf.eventcatcher.server.EventCatcherService)

There are 5000008 documents in the index


Lucene index manager (com.cloudera.cmf.eventcatcher.server.SingleIndexManager)

There are 5000008 events in the index with 526774 deleted events waiting to be expunged.
Last searcher refresh occurred 4.09 secs ago and took 0.001 secs.
Last commit occurred 16.44 secs ago and took 0.044 secs.
Last cleanup occurred 146.06 secs ago and took 2.089 secs.
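
(The deleted-events figure reflects how Lucene handles deletes: removed documents are only marked as deleted and stay in the index until a segment merge or cleanup reclaims them, so live plus deleted equals the reader's maxDoc. As a point of reference, a minimal sketch of reading these same counts through Lucene's standard reader API; the index path here is a hypothetical stand-in for the Event Server's data directory, and FSDirectory.open(Path) assumes a modern Lucene release.)

    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.store.FSDirectory;
    import java.nio.file.Paths;

    public class IndexStats {
        public static void main(String[] args) throws Exception {
            // Hypothetical path; substitute the Event Server's actual index directory.
            try (FSDirectory dir = FSDirectory.open(Paths.get("/var/lib/cloudera-scm-eventserver"));
                 DirectoryReader reader = DirectoryReader.open(dir)) {
                System.out.println("live docs:    " + reader.numDocs());        // "events in the index"
                System.out.println("deleted docs: " + reader.numDeletedDocs()); // "waiting to be expunged"
                System.out.println("max doc:      " + reader.maxDoc());         // live + deleted
            }
        }
    }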


Event Ingester (com.cloudera.cmf.eventcatcher.server.EventIngester)

Seen 42414565 events
Last events:
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240926/23/mobile/click":root:flume:drwxr-xr-x, timestamp=1728857820611}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.flume.ChannelFullException: Space for commit to queue couldn't be acquired. Sinks are likely not keeping up with sources, or the buffer size is too tight
    	at org.apache.flume.channel.MemoryChannel$MemoryTransaction.doCommit(MemoryChannel.java:128)
    	at org.apache.flume.channel.BasicTransactionSemantics.commit(BasicTransactionSemantics.java:151)
    	at org.apache.flume.channel.ChannelProcessor.processEventBatch(ChannelProcessor.java:194)
    	at org.apache.flume.source.AvroSource.appendBatch(AvroSource.java:402)
    	at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.avro.ipc.specific.SpecificResponder.respond(SpecificResponder.java:91)
    	at org.apache.avro.ipc.Responder.respond(Responder.java:151)
    	at org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.messageReceived(NettyServer.java:188)
    	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    	at org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:173)
    	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    	at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
    	at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
    	at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:310)
    	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    	at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:70)
    	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    	at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
    	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
    	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
    	at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    	at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    	at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.flume.ChannelFullException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-78ad94309f1d685dd767c897ac3dcfa8], SEVERITY=[CRITICAL], SERVICE=[flume], HOST_IDS=[14e99d03-e70e-4413-a4d5-eb1304aadceb], SERVICE_TYPE=[FLUME], LOG_LEVEL=[ERROR], HOSTS=[mf-43.com], EVENTCODE=[EV_LOG_EVENT]}, content=Avro source s_otv_click: Unable to process event batch. Exception follows., timestamp=1728857880080}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/06/otv/show":root:flume:drwxr-xr-x, timestamp=1728857880164}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/06/otv/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/06/otv/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-905e6956a22a22dd64d23e96e4697449], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[c9efd392-7aec-42ca-a20e-3a3745ce74ee], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-44.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728857880169}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240929/10/mobile/show":root:flume:drwxr-xr-x, timestamp=1728857940127}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240929/10/mobile/show":root:flume:drwxr-xr-x, timestamp=1728858000777}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240929/10/mobile/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240929/10/mobile/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-905e6956a22a22dd64d23e96e4697449], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[c9efd392-7aec-42ca-a20e-3a3745ce74ee], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-44.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728858000776}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/04/otv/click":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor14.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/04/otv/click":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-78ad94309f1d685dd767c897ac3dcfa8], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[14e99d03-e70e-4413-a4d5-eb1304aadceb], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-43.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728858002438}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240926/23/mobile/click":root:flume:drwxr-xr-x, timestamp=1728858060457}
  • AvroEventWrapper{attributes={STACKTRACE=[java.lang.RuntimeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
    Mon Oct 14 06:21:25 CST 2024, RpcRetryingCaller{globalStartTime=1728858085442, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    
    	at com.google.common.base.Throwables.propagate(Throwables.java:160)
    	at com.cloudera.cmon.firehose.polling.hbase.RegionHealthCanary.doWork(RegionHealthCanary.java:244)
    	at com.cloudera.cmon.firehose.polling.AbstractHConnectionClientTask.doWorkWithClientConfig(AbstractHConnectionClientTask.java:95)
    	at com.cloudera.cmon.firehose.polling.AbstractHConnectionClientTask.doWorkWithClientConfig(AbstractHConnectionClientTask.java:26)
    	at com.cloudera.cmon.firehose.polling.AbstractCdhWorkUsingClientConfigs.doWork(AbstractCdhWorkUsingClientConfigs.java:45)
    	at com.cloudera.cmon.firehose.polling.CdhTask$InstrumentedWork.doWork(CdhTask.java:230)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper.runTask(ImpersonatingTaskWrapper.java:72)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper.access$000(ImpersonatingTaskWrapper.java:21)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper$1.run(ImpersonatingTaskWrapper.java:107)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
    	at com.cloudera.cmf.cdh5client.security.UserGroupInformationImpl.doAs(UserGroupInformationImpl.java:44)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper.doWork(ImpersonatingTaskWrapper.java:103)
    	at com.cloudera.cmf.cdhclient.CdhExecutor$1.call(CdhExecutor.java:125)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
    Mon Oct 14 06:21:25 CST 2024, RpcRetryingCaller{globalStartTime=1728858085442, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    
    	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:157)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4313)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4305)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:452)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:436)
    	at com.cloudera.cmf.cdh5client.hbase.HBaseAdminImpl.listTables(HBaseAdminImpl.java:39)
    	at com.cloudera.cmon.firehose.polling.hbase.RegionHealthCanary.doWork(RegionHealthCanary.java:220)
    	... 17 more
    Caused by: org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1681)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1701)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1858)
    	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
    	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:134)
    	... 23 more
    Caused by: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    	at org.apache.hadoop.hbase.zookeeper.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:154)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1631)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1672)
    	... 27 more
    ], ROLE_TYPE=[SERVICEMONITOR], EXCEPTION_TYPES=[java.lang.RuntimeException, org.apache.hadoop.hbase.client.RetriesExhaustedException, org.apache.hadoop.hbase.MasterNotRunningException, java.io.IOException], CATEGORY=[LOG_MESSAGE], ROLE=[mgmt-SERVICEMONITOR-3127c5a7e3d69916aa9e0d8ce7430d8a], SEVERITY=[IMPORTANT], SERVICE=[mgmt], HOST_IDS=[b772707b-0199-46ba-9c1f-314e5b71b12f], SERVICE_TYPE=[MGMT], LOG_LEVEL=[WARN], HOSTS=[mf-41.com], EVENTCODE=[EV_LOG_EVENT]}, content=Exception in doWork for task: hbase_HBASE_REGION_HEALTH_CANARY, timestamp=1728858085485}
  • AvroEventWrapper{attributes={EXCEPTION_TYPES=[org.apache.hadoop.hbase.client.RetriesExhaustedException, org.apache.hadoop.hbase.MasterNotRunningException, java.io.IOException], SEVERITY=[IMPORTANT], SERVICE=[hbase], HBASE_CANARY_ERRORS=[org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
    Mon Oct 14 06:21:25 CST 2024, RpcRetryingCaller{globalStartTime=1728858085442, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    ], DURATION_MS=[19], HBASE_CANARY_TABLE_RESULTS=[[]], TABLE=[], CATEGORY=[HBASE], HBASE_CANARY_TOTAL_REGIONS=[0], HBASE_CANARY_UNHEALTHY_REGION_COUNT=[0], EVENT_VERSION=[1], REGION=[], HOSTS=[], EVENTCODE=[EV_HBASE_REGION_HEALTH_CANARY_RESULTS]}, content=Error running canary: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
    Mon Oct 14 06:21:25 CST 2024, RpcRetryingCaller{globalStartTime=1728858085442, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    
    HBase region health: 0 unhealthy regions
    , timestamp=1728858085443}
  • AvroEventWrapper{attributes={STACKTRACE=[org.springframework.mail.MailAuthenticationException: Authentication failed; nested exception is javax.mail.AuthenticationFailedException: 535 Error: authentication failed, system busy
    
    	at org.springframework.mail.javamail.JavaMailSenderImpl.doSend(JavaMailSenderImpl.java:392)
    	at org.springframework.mail.javamail.JavaMailSenderImpl.send(JavaMailSenderImpl.java:340)
    	at org.springframework.mail.javamail.JavaMailSenderImpl.send(JavaMailSenderImpl.java:355)
    	at org.springframework.mail.javamail.JavaMailSenderImpl.send(JavaMailSenderImpl.java:344)
    	at org.apache.camel.component.mail.MailProducer.process(MailProducer.java:44)
    	at org.apache.camel.impl.converter.AsyncProcessorTypeConverter$ProcessorToAsyncProcessorBridge.process(AsyncProcessorTypeConverter.java:50)
    	at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:77)
    	at org.apache.camel.processor.SendProcessor$2.doInAsyncProducer(SendProcessor.java:104)
    	at org.apache.camel.impl.ProducerCache.doInAsyncProducer(ProducerCache.java:272)
    	at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:98)
    	at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:77)
    	at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:98)
    	at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:89)
    	at org.apache.camel.processor.interceptor.TraceInterceptor.process(TraceInterceptor.java:99)
    	at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:77)
    	at org.apache.camel.processor.RedeliveryErrorHandler.processErrorHandler(RedeliveryErrorHandler.java:299)
    	at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:208)
    	at org.apache.camel.processor.DefaultChannel.process(DefaultChannel.java:269)
    	at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:77)
    	at org.apache.camel.processor.Pipeline.process(Pipeline.java:125)
    	at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)
    	at org.apache.camel.processor.UnitOfWorkProcessor.process(UnitOfWorkProcessor.java:109)
    	at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:77)
    	at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:98)
    	at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:89)
    	at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:68)
    	at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:77)
    	at org.apache.camel.component.seda.SedaConsumer.sendToConsumers(SedaConsumer.java:189)
    	at org.apache.camel.component.seda.SedaConsumer.run(SedaConsumer.java:121)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: javax.mail.AuthenticationFailedException: 535 Error: authentication failed, system busy
    
    	at com.sun.mail.smtp.SMTPTransport$Authenticator.authenticate(SMTPTransport.java:648)
    	at com.sun.mail.smtp.SMTPTransport.protocolConnect(SMTPTransport.java:583)
    	at javax.mail.Service.connect(Service.java:313)
    	at org.springframework.mail.javamail.JavaMailSenderImpl.doSend(JavaMailSenderImpl.java:389)
    	... 31 more
    ], ROLE_TYPE=[ALERTPUBLISHER], EXCEPTION_TYPES=[org.springframework.mail.MailAuthenticationException, javax.mail.AuthenticationFailedException], CATEGORY=[LOG_MESSAGE], ROLE=[mgmt-ALERTPUBLISHER-3127c5a7e3d69916aa9e0d8ce7430d8a], SEVERITY=[CRITICAL], SERVICE=[mgmt], HOST_IDS=[b772707b-0199-46ba-9c1f-314e5b71b12f], SERVICE_TYPE=[MGMT], LOG_LEVEL=[ERROR], HOSTS=[mf-41.com], EVENTCODE=[EV_LOG_EVENT]}, content=Failed delivery for exchangeId: ID-mf-41-com-42455-1680053374007-0-16369083. Exhausted after delivery attempt: 1 caught: org.springframework.mail.MailAuthenticationException: Authentication failed; nested exception is javax.mail.AuthenticationFailedException: 535 Error: authentication failed, system busy
    , timestamp=1728858097629}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/06/otv/show":root:flume:drwxr-xr-x, timestamp=1728858120398}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/06/otv/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor14.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/06/otv/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-78ad94309f1d685dd767c897ac3dcfa8], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[14e99d03-e70e-4413-a4d5-eb1304aadceb], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-43.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728858120380}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/10/otv/click":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240921/10/otv/click":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-905e6956a22a22dd64d23e96e4697449], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[c9efd392-7aec-42ca-a20e-3a3745ce74ee], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-44.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728858121120}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20241001/08/mobile/show":root:flume:drwxr-xr-x, timestamp=1728858180126}
  • AvroEventWrapper{attributes={ROLE_TYPE=[NAMENODE], CATEGORY=[LOG_MESSAGE], ROLE=[hdfs-NAMENODE-0ab02a2e1651ccf55f7d96cc4bffc3a4], SEVERITY=[IMPORTANT], SERVICE=[hdfs], HOST_IDS=[4ebb60e6-1f25-4ea4-a06b-b10769b4ffda], SERVICE_TYPE=[HDFS], LOG_LEVEL=[WARN], HOSTS=[mf-42.com], EVENTCODE=[EV_LOG_EVENT]}, content=PriviledgedActionException as:flume (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240926/23/mobile/click":root:flume:drwxr-xr-x, timestamp=1728858240138}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240926/23/mobile/click":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20240926/23/mobile/click":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-905e6956a22a22dd64d23e96e4697449], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[c9efd392-7aec-42ca-a20e-3a3745ce74ee], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-44.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728858240113}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
    Mon Oct 14 06:24:00 CST 2024, RpcRetryingCaller{globalStartTime=1728858240460, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    
    	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:157)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4313)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4305)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:452)
    	at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:436)
    	at com.cloudera.cmf.cdh5client.hbase.HBaseAdminImpl.listTables(HBaseAdminImpl.java:39)
    	at com.cloudera.cmon.firehose.polling.hbase.TableAndRegionInfoFetcher.getActiveHBaseInfo(TableAndRegionInfoFetcher.java:257)
    	at com.cloudera.cmon.firehose.polling.hbase.TableAndRegionInfoFetcher.doWork(TableAndRegionInfoFetcher.java:108)
    	at com.cloudera.cmon.firehose.polling.AbstractHConnectionClientTask.doWorkWithClientConfig(AbstractHConnectionClientTask.java:95)
    	at com.cloudera.cmon.firehose.polling.AbstractHConnectionClientTask.doWorkWithClientConfig(AbstractHConnectionClientTask.java:26)
    	at com.cloudera.cmon.firehose.polling.AbstractCdhWorkUsingClientConfigs.doWork(AbstractCdhWorkUsingClientConfigs.java:45)
    	at com.cloudera.cmon.firehose.polling.CdhTask$InstrumentedWork.doWork(CdhTask.java:230)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper.runTask(ImpersonatingTaskWrapper.java:72)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper.access$000(ImpersonatingTaskWrapper.java:21)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper$1.run(ImpersonatingTaskWrapper.java:107)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
    	at com.cloudera.cmf.cdh5client.security.UserGroupInformationImpl.doAs(UserGroupInformationImpl.java:44)
    	at com.cloudera.cmf.cdhclient.util.ImpersonatingTaskWrapper.doWork(ImpersonatingTaskWrapper.java:103)
    	at com.cloudera.cmf.cdhclient.CdhExecutor$1.call(CdhExecutor.java:125)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1681)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1701)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1858)
    	at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
    	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:134)
    	... 24 more
    Caused by: java.io.IOException: Can't get master address from ZooKeeper; znode data == null
    	at org.apache.hadoop.hbase.zookeeper.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:154)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1631)
    	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1672)
    	... 28 more
    ], ROLE_TYPE=[SERVICEMONITOR], EXCEPTION_TYPES=[org.apache.hadoop.hbase.client.RetriesExhaustedException, org.apache.hadoop.hbase.MasterNotRunningException, java.io.IOException], CATEGORY=[LOG_MESSAGE], ROLE=[mgmt-SERVICEMONITOR-3127c5a7e3d69916aa9e0d8ce7430d8a], SEVERITY=[IMPORTANT], SERVICE=[mgmt], HOST_IDS=[b772707b-0199-46ba-9c1f-314e5b71b12f], SERVICE_TYPE=[MGMT], LOG_LEVEL=[WARN], HOSTS=[mf-41.com], EVENTCODE=[EV_LOG_EVENT]}, content=(14 skipped) Exception in doWork for task: hbase_HBASE_TABLE_AND_REGION_INFO_TASK, timestamp=1728858240529}
  • AvroEventWrapper{attributes={STACKTRACE=[org.apache.hadoop.security.AccessControlException: Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20241001/08/mobile/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at sun.reflect.GeneratedConstructorAccessor14.newInstance(Unknown Source)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2136)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1803)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1727)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:437)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:433)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:433)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:374)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:926)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
    	at org.apache.flume.sink.hdfs.HDFSCompressedDataStream.open(HDFSCompressedDataStream.java:94)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:269)
    	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
    	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
    	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
    	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=flume, access=WRITE, inode="/user/flume/mdfull/ssp/20241001/08/mobile/show":root:flume:drwxr-xr-x
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
    	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3842)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6762)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2915)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2833)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2718)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:608)
    	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:115)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:412)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    
    	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    	at com.sun.proxy.$Proxy23.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:311)
    	at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:606)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    	at com.sun.proxy.$Proxy24.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:2131)
    	... 21 more
    ], ROLE_TYPE=[AGENT], EXCEPTION_TYPES=[org.apache.hadoop.security.AccessControlException, org.apache.hadoop.ipc.RemoteException], CATEGORY=[LOG_MESSAGE], ROLE=[flume-AGENT-78ad94309f1d685dd767c897ac3dcfa8], SEVERITY=[IMPORTANT], SERVICE=[flume], HOST_IDS=[14e99d03-e70e-4413-a4d5-eb1304aadceb], SERVICE_TYPE=[FLUME], LOG_LEVEL=[WARN], HOSTS=[mf-43.com], EVENTCODE=[EV_LOG_EVENT]}, content=HDFS IO error, timestamp=1728858240661}
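
Every flume-AGENT failure above is the same fault: the agent writes as user flume, but each target directory is owned root:flume with mode drwxr-xr-x, so the group carries no WRITE bit and create() is denied at the parent inode. A minimal remediation sketch against the Hadoop FileSystem API, assuming the subtree should simply belong to flume; the path, the 0775 mode, and the recursive walk are illustrative, and setOwner must run as the HDFS superuser:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class FixFlumeDirOwnership {
        public static void main(String[] args) throws Exception {
            // Assumes HADOOP_CONF_DIR is on the classpath so fs.defaultFS resolves.
            FileSystem fs = FileSystem.get(new Configuration());
            chownTree(fs, new Path("/user/flume/mdfull/ssp"));
            fs.close();
        }

        // Hand the directory tree to flume:flume and open the group WRITE bit
        // that the drwxr-xr-x mode in the events above is missing.
        static void chownTree(FileSystem fs, Path dir) throws Exception {
            fs.setOwner(dir, "flume", "flume");
            fs.setPermission(dir, new FsPermission((short) 0775));
            for (FileStatus child : fs.listStatus(dir)) {
                if (child.isDirectory()) {
                    chownTree(fs, child.getPath());
                }
            }
        }
    }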

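The Service Monitor event is different in kind: "Can't get master address from ZooKeeper; znode data == null" means no active HBase Master has published its address in ZooKeeper, so the client fails before any RPC is attempted. A minimal sketch of the same check made directly, assuming the default znode layout (/hbase/master); the quorum address is a placeholder:

    import org.apache.zookeeper.ZooKeeper;
    import org.apache.zookeeper.data.Stat;

    public class CheckHBaseMasterZnode {
        public static void main(String[] args) throws Exception {
            // Placeholder quorum; use the cluster's hbase.zookeeper.quorum value.
            ZooKeeper zk = new ZooKeeper("zk-host:2181", 30000, event -> {});
            try {
                // zookeeper.znode.parent defaults to /hbase; the active master
                // registers its address under /hbase/master.
                Stat stat = zk.exists("/hbase/master", false);
                if (stat == null) {
                    System.out.println("no master znode: HBase Master never registered");
                } else {
                    byte[] data = zk.getData("/hbase/master", false, null);
                    System.out.println("master znode holds "
                        + (data == null ? 0 : data.length) + " bytes");
                }
            } finally {
                zk.close();
            }
        }
    }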

EventMetricsPublisher (com.cloudera.cmf.eventcatcher.server.EventMetricsPublisher)

Currently not running.
3312543 runs so far (0 slow); 0 of them reported exceptions.
Last duration: 17 ms. Total duration: 64981758 ms.
Last start: October 14, 2024 6:24:30 AM CST. Last end: October 14, 2024 6:24:30 AM CST.
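
Taken together these figures average out to 64981758 ms / 3312543 runs ≈ 19.6 ms per run, consistent with the 17 ms last duration.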

Event Schema Checker (com.cloudera.cmf.eventcatcher.server.EventSchemaChecker)

Num events without event codes: 0
Num events with unknown event codes: 0
Num events not matching schema: 0

Events not matching schema:


    EventIngester (com.cloudera.cmon.pipeline.Pipeline)

    Pipeline stage summary:
    name                          dropped  skipped  forwarded  processed  size
    Pipeline Stage tagger-writer  0        0        0          42414565   0
    A total of 42414565 messages received. Current rate: 0.00 msg/sec. Max rate: 84.81 msg/sec.

    Pipeline Stage tagger-writer (com.cloudera.cmon.pipeline.PipelineStage)

    Input queue size is 0 (of 10000).
    42414565 events processed.
    0 events skipped.
    0 events forwarded.
    0 events dropped.
    Current rate: 0.00 msg/sec. Max rate: 84.81 msg/sec.
    Thread pool size is 2 (max 4).
    First element in queue: null
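
    These counters describe a bounded hand-off: a 10000-slot input queue drained by 2 worker threads that can grow to 4. A minimal java.util.concurrent sketch of an equivalent arrangement (illustrative only, not the Cloudera pipeline code):

        import java.util.concurrent.ArrayBlockingQueue;
        import java.util.concurrent.ThreadPoolExecutor;
        import java.util.concurrent.TimeUnit;

        public class TaggerWriterStageSketch {
            public static void main(String[] args) {
                // 2 core / 4 max workers over a 10000-slot bounded queue, mirroring
                // "Thread pool size is 2 (max 4)" and "(of 10000)" above.
                ThreadPoolExecutor stage = new ThreadPoolExecutor(
                        2, 4, 60, TimeUnit.SECONDS, new ArrayBlockingQueue<>(10000));
                stage.execute(() -> System.out.println("processed one event"));
                stage.shutdown();
            }
        }

    One limit of the analogy: ThreadPoolExecutor only adds threads beyond the core count once its queue fills, so a stage that wants to scale sooner would manage its workers explicitly.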


    Events queryer (com.cloudera.cmf.eventcatcher.server.EventsQueryer)

    Seen 19962048 queries
    Last query took 2ms
    Last 20 queries:
    • +__persist_timestamp:[1728858002439 TO 1728858675751]
    • +__persist_timestamp:[1728858060442 TO 1728858685751]
    • +__persist_timestamp:[1728858060442 TO 1728858695751]
    • +__persist_timestamp:[1728858085488 TO 1728858705751]
    • +__persist_timestamp:[1728858097649 TO 1728858715751]
    • +__persist_timestamp:[1728858097649 TO 1728858725751]
    • +__persist_timestamp:[1728858121123 TO 1728858735751]
    • +__persist_timestamp:[1728858121123 TO 1728858745751]
    • +__persist_timestamp:[1728858121123 TO 1728858755751]
    • +__persist_timestamp:[1728858121123 TO 1728858765751]
    • +__persist_timestamp:[1728858121123 TO 1728858775751]
    • +__persist_timestamp:[1728858121123 TO 1728858785751]
    • +__persist_timestamp:[1728858180107 TO 1728858795751]
    • +__persist_timestamp:[1728858180107 TO 1728858805751]
    • +__persist_timestamp:[1728858180107 TO 1728858815751]
    • +__persist_timestamp:[1728858180107 TO 1728858825751]
    • +__persist_timestamp:[1728858180107 TO 1728858835751]
    • +__persist_timestamp:[1728858180107 TO 1728858845751]
    • +__persist_timestamp:[1728858180107 TO 1728858855751]
    • +__persist_timestamp:[1728858240662 TO 1728858865751]
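
    Each entry above is a Lucene range query over the numeric __persist_timestamp field: the lower bound tracks the newest event already seen, and the upper bound advances with each poll. A sketch of building an equivalent query programmatically, assuming a Lucene 4.x-era API (the server's actual Lucene version is not shown on this page):

        import org.apache.lucene.search.NumericRangeQuery;
        import org.apache.lucene.search.Query;

        public class PersistTimestampRange {
            // Inclusive bounds, matching the "[A TO B]" syntax in the list above.
            public static Query between(long fromMs, long toMs) {
                return NumericRangeQuery.newLongRange(
                        "__persist_timestamp", fromMs, toMs, true, true);
            }

            public static void main(String[] args) {
                System.out.println(between(1728858240662L, 1728858865751L));
            }
        }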


    Lucene index manager (com.cloudera.cmf.eventcatcher.server.SingleIndexManager)

    There are 5000008 events in the index with 526774 deleted events waiting to be expunged.
    Last searcher refresh occurred 4.09 secs ago and took 0.001 secs.
    Last commit occurred 16.45 secs ago and took 0.044 secs.
    Last cleanup occurred 146.07 secs ago and took 2.089 secs.
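
    The 526774 deletions "waiting to be expunged" are tombstones: Lucene drops deleted documents physically only when the segments holding them are merged, which is what the periodic cleanup pass above does. A hedged offline sketch of forcing that reclamation through the IndexWriter API, assuming Lucene 4.x and an illustrative index path; this is only safe with the server stopped, since a second IndexWriter cannot acquire the index lock:

        import java.io.File;
        import org.apache.lucene.analysis.standard.StandardAnalyzer;
        import org.apache.lucene.index.IndexWriter;
        import org.apache.lucene.index.IndexWriterConfig;
        import org.apache.lucene.store.Directory;
        import org.apache.lucene.store.FSDirectory;
        import org.apache.lucene.util.Version;

        public class ExpungeDeletedEvents {
            public static void main(String[] args) throws Exception {
                // Index path is an assumption; point at the event store's index.
                Directory dir = FSDirectory.open(new File("/path/to/event-index"));
                IndexWriterConfig cfg = new IndexWriterConfig(
                        Version.LUCENE_4_10_0, new StandardAnalyzer(Version.LUCENE_4_10_0));
                try (IndexWriter writer = new IndexWriter(dir, cfg)) {
                    // Merges only segments carrying deletes, dropping the tombstones.
                    writer.forceMergeDeletes();
                }
            }
        }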


    Event Debug Server (com.cloudera.enterprise.DebugServer)

    Web server is running? true

    AvroEventStoreServer (com.cloudera.cmf.eventcatcher.server.AvroEventStoreServer)

    Avro NettyServer is running on port 7184
    20 max worker threads


    AvroEventStoreHttpService (com.cloudera.cmf.eventcatcher.server.AvroEventStoreHttpService)

    Avro HttpServer is running on port 7185
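
    These two ports are the standard pair of Avro IPC transports, a Netty binary server and an HTTP server, typically sharing one responder. A minimal sketch of standing up such a pair, assuming Avro 1.7-era classes and a trivial hand-parsed protocol in place of the event store's real one:

        import java.net.InetSocketAddress;
        import org.apache.avro.Protocol;
        import org.apache.avro.ipc.HttpServer;
        import org.apache.avro.ipc.NettyServer;
        import org.apache.avro.ipc.Server;
        import org.apache.avro.ipc.generic.GenericResponder;
        import org.apache.avro.util.Utf8;

        public class AvroServerPairSketch {
            public static void main(String[] args) throws Exception {
                // A stand-in protocol; the event store's real Avro protocol
                // is not shown on this page.
                Protocol proto = Protocol.parse(
                        "{\"protocol\":\"Ping\",\"namespace\":\"demo\",\"types\":[],"
                        + "\"messages\":{\"ping\":{\"request\":[],\"response\":\"string\"}}}");
                GenericResponder responder = new GenericResponder(proto) {
                    @Override
                    public Object respond(Protocol.Message message, Object request) {
                        return new Utf8("pong");
                    }
                };
                // Binary transport, as on 7184 (NettyServer binds in its constructor).
                Server netty = new NettyServer(responder, new InetSocketAddress(7184));
                // HTTP transport, as on 7185.
                Server http = new HttpServer(responder, 7185);
                http.start();
                System.out.println("netty on " + netty.getPort()
                        + ", http on " + http.getPort());
            }
        }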