
[Solved] Shiro Error: org.apache.shiro.authc.UnknownAccountException

Error message: the following error is reported during login:

org.apache.shiro.authc.UnknownAccountException: Realm [org.apache.shiro.realm.SimpleAccountRealm@30cecdca] was unable to find account data for the submitted AuthenticationToken [org.apache.shiro.authc.UsernamePasswordToken - wmyskxz, rememberMe=false].

	at org.apache.shiro.authc.pam.ModularRealmAuthenticator.doSingleRealmAuthentication(ModularRealmAuthenticator.java:184)
	at org.apache.shiro.authc.pam.ModularRealmAuthenticator.doAuthenticate(ModularRealmAuthenticator.java:267)
	at org.apache.shiro.authc.AbstractAuthenticator.authenticate(AbstractAuthenticator.java:198)
	at org.apache.shiro.mgt.AuthenticatingSecurityManager.authenticate(AuthenticatingSecurityManager.java:106)
	at org.apache.shiro.mgt.DefaultSecurityManager.login(DefaultSecurityManager.java:274)
	at org.apache.shiro.subject.support.DelegatingSubject.login(DelegatingSubject.java:260)
	at com.example.demo.DemoApplicationTests.contextLoads(DemoApplicationTests.java:33)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:725)
	at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
	at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
	at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
	at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
	at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
	at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:214)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:210)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:135)
	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:66)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
	at java.util.ArrayList.forEach(ArrayList.java:1257)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
	at java.util.ArrayList.forEach(ArrayList.java:1257)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
	at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
	at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107)
	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)
	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)
	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)
	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86)
	at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86)
	at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53)
	at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:69)
	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)

Code:

package com.example.demo;

import org.apache.shiro.SecurityUtils;
import org.apache.shiro.authc.UsernamePasswordToken;
import org.apache.shiro.mgt.DefaultSecurityManager;
import org.apache.shiro.realm.SimpleAccountRealm;
import org.apache.shiro.subject.Subject;
import org.junit.Before;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;


@SpringBootTest
public class DemoApplicationTests {

    private SimpleAccountRealm simpleAccountRealm = new SimpleAccountRealm();

    // JUnit 4 annotation: under JUnit 5 this method is never executed
    @Before
    public void addUsr(){
        simpleAccountRealm.addAccount("wmyskxz","123456");
    }

    @Test
    void contextLoads() {
        DefaultSecurityManager defaultSecurityManager =  new DefaultSecurityManager();
        defaultSecurityManager.setRealm(simpleAccountRealm);

        SecurityUtils.setSecurityManager(defaultSecurityManager);
        Subject subject = SecurityUtils.getSubject();

        UsernamePasswordToken token = new UsernamePasswordToken("wmyskxz","123456");
        subject.login(token);

        System.out.println("isAuthenticated:"+subject.isAuthenticated());

        subject.logout();

        System.out.println("isAuthenticated:"+subject.isAuthenticated());

    }

}

The error occurs because JUnit's @Before method never ran, so the account was never registered. Spring Boot 2.2.9 uses JUnit 5 by default, and in JUnit 5 the JUnit 4 annotation @Before has been replaced by @BeforeEach. Changing @Before to @BeforeEach fixes the problem.
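For reference, a minimal corrected setup method (the @BeforeEach import is already present in the class, and the now-unused import org.junit.Before can be removed):

    // JUnit 5 replacement for JUnit 4's @Before: runs before each @Test,
    // so the account is registered before contextLoads() calls subject.login()
    @BeforeEach
    public void addUsr(){
        simpleAccountRealm.addAccount("wmyskxz","123456");
    }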

[Solved] tp6.0 open_basedir Error: Warning: require(): open_basedir restriction in effect.

Error message

Warning: require(): open_basedir restriction in effect. File(/work/tp6/vendor/autoload.php) is not within the allowed path(s): (/work/tp6/public/:/tmp/) in /work/tp6/public/index.php on line 15

Warning: require(/work/tp6/vendor/autoload.php): failed to open stream: Operation not permitted in /work/tp6/public/index.php on line 15

Fatal error: require(): Failed opening required '/work/tp6/public/../vendor/autoload.php' (include_path='.:') in /work/tp6/public/index.php on line 15


Checking the error message and related documentation shows that this is a PHP open_basedir configuration problem: PHP is forbidden from requiring files outside the authorized directories.

Normally this problem does not occur; it is usually caused by the server configuration, where the restriction is applied for security.

Because the entry file lives in public while vendor/autoload.php lives one level up, the open_basedir value must be changed so that it no longer confines PHP to the public directory.

Solution:
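A minimal sketch of the fix, assuming open_basedir is set in php.ini (it may instead live in a .user.ini file or the web server's PHP/FPM configuration): widen the allowed paths to the project root so that vendor/autoload.php becomes reachable.

; Before: open_basedir = /work/tp6/public/:/tmp/
; After: allow the whole project directory
open_basedir = /work/tp6/:/tmp/

After changing the value, restart PHP-FPM or the web server for it to take effect.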

[Solved] SpringMVC Shiro Error: org.apache.shiro.UnavailableSecurityManagerException No SecurityManager

The configuration is as follows:

    <!-- shiro -->
    <filter>
        <filter-name>shiroFilter</filter-name>
        <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
        <init-param>
            <param-name>targetFilterLifecycle</param-name>
            <param-value>true</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>shiroFilter</filter-name>
        <url-pattern>/</url-pattern>
    </filter-mapping>

The error is reported as follows:

SEVERE: Servlet.service() for servlet [springMVC] in context with path [/myweb] threw exception [Request processing failed; nested exception is org.apache.shiro.UnavailableSecurityManagerException: No SecurityManager accessible to the calling code, either bound to the org.apache.shiro.util.ThreadContext or as a vm static singleton.  This is an invalid application configuration.] with root cause
org.apache.shiro.UnavailableSecurityManagerException: No SecurityManager accessible to the calling code, either bound to the org.apache.shiro.util.ThreadContext or as a vm static singleton.  This is an invalid application configuration.
    at org.apache.shiro.SecurityUtils.getSecurityManager(SecurityUtils.java:123)
    at org.apache.shiro.subject.Subject$Builder.<init>(Subject.java:627)
    at org.apache.shiro.SecurityUtils.getSubject(SecurityUtils.java:56)
    at com.myweb.controller.UserController.login(UserController.java:136)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandleMethod(RequestMappingHandlerAdapter.java:776)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:705)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:966)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:868)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:842)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.springframework.orm.hibernate4.support.OpenSessionInViewFilter.doFilterInternal(OpenSessionInViewFilter.java:151)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:85)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
    at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1527)
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1484)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:722)

Solution:

Replace "/" with "/*" in url-pattern. With "/", the filter mapping does not match the application's request paths, so the Shiro filter never runs and no SecurityManager is ever bound to the calling thread; "/*" matches every request.

    <!-- shiro -->
    <filter>
        <filter-name>shiroFilter</filter-name>
        <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
        <init-param>
            <param-name>targetFilterLifecycle</param-name>
            <param-value>true</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>shiroFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>

[Solved] nested exception is org.apache.ibatis.builder.BuilderException: Error invoking SqlProvider method

This resolves: nested exception is org.apache.ibatis.builder.BuilderException: Error invoking SqlProvider method (tk.mybatis.mapper.provider.base.BaseInsertProvider.dynamicSQL)

When using the general Mapper (tk.mybatis) in a Spring Boot project, the following error occurs:

java.lang.NoSuchMethodException: tk.mybatis.mapper.provider.base.BaseSelectProvider.<init>()
	at java.lang.Class.getConstructor0(Class.java:3082) ~[na:1.8.0_181]
	at java.lang.Class.newInstance(Class.java:412) ~[na:1.8.0_181]
	at org.apache.ibatis.builder.annotation.ProviderSqlSource.invokeProviderMethod(ProviderSqlSource.java:165) ~[mybatis-3.4.6.jar:3.4.6]
	at org.apache.ibatis.builder.annotation.ProviderSqlSource.createSqlSource(ProviderSqlSource.java:116) ~[mybatis-3.4.6.jar:3.4.6]
	at org.apache.ibatis.builder.annotation.ProviderSqlSource.getBoundSql(ProviderSqlSource.java:102) ~[mybatis-3.4.6.jar:3.4.6]
	at org.apache.ibatis.mapping.MappedStatement.getBoundSql(MappedStatement.java:292) ~[mybatis-3.4.6.jar:3.4.6]
	at com.github.pagehelper.PageInterceptor.intercept(PageInterceptor.java:83) ~[pagehelper-5.1.2.jar:na]
	at org.apache.ibatis.plugin.Plugin.invoke(Plugin.java:61) ~[mybatis-3.4.6.jar:3.4.6]
	at com.sun.proxy.$Proxy137.query(Unknown Source) ~[na:na]
	at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:148) ~[mybatis-3.4.6.jar:3.4.6]
	at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:141) ~[mybatis-3.4.6.jar:3.4.6]
	at org.apache.ibatis.session.defaults.DefaultSqlSession.selectOne(DefaultSqlSession.java:77) ~[mybatis-3.4.6.jar:3.4.6]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_181]
	at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:433) ~[mybatis-spring-1.3.2.jar:1.3.2]
	at com.sun.proxy.$Proxy84.selectOne(Unknown Source) ~[na:na]
	at org.mybatis.spring.SqlSessionTemplate.selectOne(SqlSessionTemplate.java:166) ~[mybatis-spring-1.3.2.jar:1.3.2]
	at org.apache.ibatis.binding.MapperMethod.execute(MapperMethod.java:83) ~[mybatis-3.4.6.jar:3.4.6]
	at org.apache.ibatis.binding.MapperProxy.invoke(MapperProxy.java:59) ~[mybatis-3.4.6.jar:3.4.6]
	at com.sun.proxy.$Proxy97.selectByPrimaryKey(Unknown Source) ~[na:na]
	at com.user.service.impl.UserServiceImpl.findById(UserServiceImpl.java:197) ~[classes/:na]
	at com.user.controller.UserController.login(UserController.java:41) ~[classes/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_181]
	at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:189) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:102) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:892) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:797) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1038) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:634) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882) ~[spring-webmvc-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:741) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat-embed-websocket-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:200) ~[tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:490) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:139) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:408) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:834) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1415) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_181]
	at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.17.jar:9.0.17]
	at java.lang.Thread.run(Thread.java:748) [na:1.8.0_181]


The likely cause of this error is that @MapperScan was imported from the wrong package.

The correct import is:

import tk.mybatis.spring.annotation.MapperScan;
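A minimal sketch of a Spring Boot application class showing where the import matters; the mapper package name com.user.mapper is a hypothetical placeholder:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
// Must come from tk.mybatis, NOT from org.mybatis.spring.annotation,
// otherwise the generic base methods resolve to dynamicSQL providers
// that plain MyBatis cannot instantiate, producing the error above.
import tk.mybatis.spring.annotation.MapperScan;

@SpringBootApplication
@MapperScan("com.user.mapper") // hypothetical package containing the Mapper interfaces
public class UserApplication {
    public static void main(String[] args) {
        SpringApplication.run(UserApplication.class, args);
    }
}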

[Solved] presto Compile Error: Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:3.0.0:check

The error information is as follows:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:3.0.0:check (checkstyle) on project presto-parser: You have 4 Checkstyle violations. -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:3.0.0:check (checkstyle) on project presto-parser: You have 4 Checkstyle violations.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.maven.plugin.MojoFailureException: You have 4 Checkstyle violations.
    at org.apache.maven.plugins.checkstyle.CheckstyleViolationCheckMojo.execute (CheckstyleViolationCheckMojo.java:576)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :presto-parser

After fixing the four reported Checkstyle violations, compile again.
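If the style check is not needed for a local build, the maven-checkstyle-plugin can usually be skipped instead (assuming the project uses the plugin's standard skip property):

mvn clean install -Dcheckstyle.skip=true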

Failed to get convolution algorithm. This is probably because cuDNN failed to initialize

In my case, adding the following code fixed it:

import os
import tensorflow as tf

# Make CUDA device IDs match the nvidia-smi ordering and expose only GPU 0
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# Allocate GPU memory on demand instead of grabbing it all up front,
# a common cause of "cuDNN failed to initialize"
gpus = tf.config.experimental.list_physical_devices("GPU")
if gpus:
    try:
        for gpu in gpus:
            tf.config.experimental.set_memory_growth(gpu, True)
    except RuntimeError as e:
        # Memory growth must be set before any GPU has been initialized
        print(e)
        exit(-1)

Reference:

https://stackoverflow.com/questions/53698035/failed-to-get-convolution-algorithm-this-is-probably-because-cudnn-failed-to-in

[Solved] ERROR SparkContext: Error initializing SparkContext. org.apache.spark.SparkException: Could not parse Master URL

ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: ‘<pyspark.conf.SparkConf object at 0x7fb5da40a850>’
Error Messages:

22/03/14 10:58:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/03/14 10:58:26 INFO SparkContext: Running Spark version 3.0.0
22/03/14 10:58:26 INFO ResourceUtils: ==============================================================
22/03/14 10:58:26 INFO ResourceUtils: Resources for spark.driver:

22/03/14 10:58:26 INFO ResourceUtils: ==============================================================
22/03/14 10:58:26 INFO SparkContext: Submitted application: Spark01_FindWord.py
22/03/14 10:58:26 INFO SecurityManager: Changing view acls to: root
22/03/14 10:58:26 INFO SecurityManager: Changing modify acls to: root
22/03/14 10:58:26 INFO SecurityManager: Changing view acls groups to: 
22/03/14 10:58:26 INFO SecurityManager: Changing modify acls groups to: 
22/03/14 10:58:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/03/14 10:58:27 INFO Utils: Successfully started service 'sparkDriver' on port 44307.
22/03/14 10:58:27 INFO SparkEnv: Registering MapOutputTracker
22/03/14 10:58:27 INFO SparkEnv: Registering BlockManagerMaster
22/03/14 10:58:27 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/03/14 10:58:27 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/03/14 10:58:27 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/03/14 10:58:27 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-a4554f81-46f5-4499-b6fd-2d5a9005552b
22/03/14 10:58:27 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
22/03/14 10:58:27 INFO SparkEnv: Registering OutputCommitCoordinator
22/03/14 10:58:27 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
22/03/14 10:58:27 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
22/03/14 10:58:27 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
22/03/14 10:58:27 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
22/03/14 10:58:27 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
22/03/14 10:58:27 INFO Utils: Successfully started service 'SparkUI' on port 4045.
22/03/14 10:58:27 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://node01:4045
22/03/14 10:58:27 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: '<pyspark.conf.SparkConf object at 0x7fb5da40a850>'
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2924)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)
22/03/14 10:58:27 INFO SparkUI: Stopped Spark web UI at http://node01:4045
22/03/14 10:58:27 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/03/14 10:58:27 INFO MemoryStore: MemoryStore cleared
22/03/14 10:58:27 INFO BlockManager: BlockManager stopped
22/03/14 10:58:28 INFO BlockManagerMaster: BlockManagerMaster stopped
22/03/14 10:58:28 WARN MetricsSystem: Stopping a MetricsSystem that is not running
22/03/14 10:58:28 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/03/14 10:58:28 INFO SparkContext: Successfully stopped SparkContext
Traceback (most recent call last):
  File "/ext/servers/spark3.0/MyCode/Spark01_FindWord.py", line 4, in <module>
    sc = SparkContext(conf)
  File "/ext/servers/spark3.0/python/lib/pyspark.zip/pyspark/context.py", line 131, in __init__
  File "/ext/servers/spark3.0/python/lib/pyspark.zip/pyspark/context.py", line 193, in _do_init
  File "/ext/servers/spark3.0/python/lib/pyspark.zip/pyspark/context.py", line 310, in _initialize_context
  File "/ext/servers/spark3.0/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1569, in __call__
  File "/ext/servers/spark3.0/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Could not parse Master URL: '<pyspark.conf.SparkConf object at 0x7fb5da40a850>'
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2924)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)

22/03/14 10:58:28 INFO ShutdownHookManager: Shutdown hook called
22/03/14 10:58:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-32987338-486b-448a-b9f4-3cf4d68aa85d
22/03/14 10:58:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-2160f024-2430-4ab1-b5ed-5c3afc402351
(Spark_python) root@node01:/ext/servers/spark3.0/MyCode# spark-submit Spark01_FindWord.py 

How to fix

Erroneous code:

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("MyApp")
sc = SparkContext(conf)
hdfs_file = "hdfs://node01:/user/hadoop/test06.txt"
hdfs_rdd = sc.textFile(hdfs_file).count()
print(hdfs_rdd)

Corrected code:

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("MyApp")
sc = SparkContext(conf=conf)
hdfs_file = "hdfs://node01:/user/hadoop/test06.txt"
hdfs_rdd = sc.textFile(hdfs_file).count()
print(hdfs_rdd)


Summary

The first positional parameter of the SparkContext constructor is the master URL, not the configuration object. Writing SparkContext(conf) therefore passes the SparkConf object where a master URL string is expected, which is what triggers the "Could not parse Master URL" error. The configuration must be passed explicitly as a keyword argument: SparkContext(conf=conf).

[Solved] org.apache.flink.client.program.ProgramInvocationException: The main method caused an error

After enabling checkpointing and configuring the state backend for the Flink job, submitting the job produced the following errors:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
  at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:366)
  at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:219)
  at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
  at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
  at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
  at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
  at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1761)
  at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
  at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
  at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
  at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
  at org.apache.flink.client.program.StreamContextEnvironment.getJobExecutionResult(StreamContextEnvironment.java:123)
  at org.apache.flink.client.program.StreamContextEnvironment.execute(StreamContextEnvironment.java:80)
  at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1782)
  at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1765)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
  ... 11 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
  at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
  at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
  at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
  at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
  at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
  at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$1(RestClient.java:430)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:577)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:570)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:549)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:490)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:615)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:608)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:117)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:321)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:337)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:702)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
  at org.apache.flink.shaded.netty4.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
  at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: xxxxx
  at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:292)
  at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:308)
  at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:957)
  at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
  ... 19 more
Caused by: org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: xxxxx
Caused by: java.net.ConnectException: Connection refused
  at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
  at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:714)
  at org.apache.flink.shaded.netty4.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:702)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
  at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
  at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
  at org.apache.flink.shaded.netty4.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
  at java.lang.Thread.run(Thread.java:748)


Checking the YARN logs suggested a JAR package conflict. After commenting out the Hadoop dependency in the POM file, the job was resubmitted and ran successfully.
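A sketch of the kind of POM change involved, assuming the conflict comes from a bundled Hadoop client (the artifact name is an assumption; the YARN cluster already provides Hadoop on the classpath). Either comment the dependency out entirely, or mark it provided so it is not packaged into the job JAR:

<!-- hypothetical conflicting dependency -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope> <!-- keep it off the job JAR's classpath -->
</dependency>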

[Solved] HBase Error: ERROR: org.apache.hadoop.hbase.PleaseHoldException: Master is initializing

Error reporting

After installing HBase and entering it for the first time, you may encounter this problem:

[root@zhiyong2 /]# hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/usdp-srv/srv/udp/2.0.0.0/hdfs/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/usdp-srv/srv/udp/2.0.0.0/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.1.10, rUnknown, Mon Nov 23 09:56:35 WIB 2020
Took 0.0036 seconds
hbase(main):001:0> list_namespace
NAMESPACE

ERROR: org.apache.hadoop.hbase.PleaseHoldException: Master is initializing
        at org.apache.hadoop.hbase.master.HMaster.checkInitialized(HMaster.java:3003)
        at org.apache.hadoop.hbase.master.HMaster.getNamespaces(HMaster.java:3299)
        at org.apache.hadoop.hbase.master.MasterRpcServices.listNamespaceDescriptors(MasterRpcServices.java:1237)
        at org.apache.hadoop.hbase.shaded.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:133)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)

For usage try 'help "list_namespace"'

Took 9.8027 seconds

Waiting does not help here: after waiting for a while, the master still had not finished initializing.

Solution

Because this is a new cluster with no useful data, you can clear the metadata as follows and let HBase reinitialize. Before clearing the metadata, stop the HBase components in USDP's web UI.

Delete ZK’s metadata

Since part of HBase's metadata is stored in ZooKeeper, delete it as follows:

[root@zhiyong2 /]# zkCli.sh
Connecting to localhost:2181
2022-03-03 00:09:47,689 [myid:] - INFO  [main:Environment@100] - Client environment:zookeeper.version=3.4.13-2d71af4dbe22557fda74f9a9b4309b15a7487f03, built on 06/29/2018 04:05 GMT
2022-03-03 00:09:47,693 [myid:] - INFO  [main:Environment@100] - Client environment:host.name=zhiyong2
2022-03-03 00:09:47,693 [myid:] - INFO  [main:Environment@100] - Client environment:java.version=1.8.0_202
2022-03-03 00:09:47,695 [myid:] - INFO  [main:Environment@100] - Client environment:java.vendor=Oracle Corporation
2022-03-03 00:09:47,695 [myid:] - INFO  [main:Environment@100] - Client environment:java.home=/usr/java/jdk1.8.0_202/jre
2022-03-03 00:09:47,696 [myid:] - INFO  [main:Environment@100] - Client environment:java.class.path=/srv/udp/2.0.0.0/zookeeper/bin/../zookeeper-server/target/classes:/srv/udp/2.0.0.0/zookeeper/bin/../build/classes:/srv/udp/2.0.0.0/zookeeper/bin/../zookeeper-server/target/lib/*.jar:/srv/udp/2.0.0.0/zookeeper/bin/../build/lib/*.jar:/srv/udp/2.0.0.0/zookeeper/bin/../lib/slf4j-log4j12-1.7.25.jar:/srv/udp/2.0.0.0/zookeeper/bin/../lib/slf4j-api-1.7.25.jar:/srv/udp/2.0.0.0/zookeeper/bin/../lib/netty-3.10.6.Final.jar:/srv/udp/2.0.0.0/zookeeper/bin/../lib/log4j-1.2.17.jar:/srv/udp/2.0.0.0/zookeeper/bin/../lib/jline-0.9.94.jar:/srv/udp/2.0.0.0/zookeeper/bin/../lib/audience-annotations-0.5.0.jar:/srv/udp/2.0.0.0/zookeeper/bin/../zookeeper-3.4.13.jar:/srv/udp/2.0.0.0/zookeeper/bin/../zookeeper-server/src/main/resources/lib/*.jar:/srv/udp/2.0.0.0/zookeeper/bin/../conf:.:/usr/java/jdk1.8.0_202/jre/lib/rt.jar:/usr/java/jdk1.8.0_202/lib/dt.jar:/usr/java/jdk1.8.0_202/lib/tools.jar
2022-03-03 00:09:47,696 [myid:] - INFO  [main:Environment@100] - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2022-03-03 00:09:47,696 [myid:] - INFO  [main:Environment@100] - Client environment:java.io.tmpdir=/tmp
2022-03-03 00:09:47,696 [myid:] - INFO  [main:Environment@100] - Client environment:java.compiler=<NA>
2022-03-03 00:09:47,696 [myid:] - INFO  [main:Environment@100] - Client environment:os.name=Linux
2022-03-03 00:09:47,696 [myid:] - INFO  [main:Environment@100] - Client environment:os.arch=amd64
2022-03-03 00:09:47,697 [myid:] - INFO  [main:Environment@100] - Client environment:os.version=3.10.0-957.el7.x86_64
2022-03-03 00:09:47,697 [myid:] - INFO  [main:Environment@100] - Client environment:user.name=root
2022-03-03 00:09:47,697 [myid:] - INFO  [main:Environment@100] - Client environment:user.home=/root
2022-03-03 00:09:47,697 [myid:] - INFO  [main:Environment@100] - Client environment:user.dir=/
2022-03-03 00:09:47,698 [myid:] - INFO  [main:ZooKeeper@442] - Initiating client connection, connectString=localhost:2181 sessionTimeout=30000 watcher=org.apache.zookeeper.ZooKeeperMain$MyWatcher@41906a77
2022-03-03 00:09:47,721 [myid:] - INFO  [main-SendThread(localhost:2181):ClientCnxn$SendThread@1029] - Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
Welcome to ZooKeeper!
JLine support is enabled
2022-03-03 00:09:47,808 [myid:] - INFO  [main-SendThread(localhost:2181):ClientCnxn$SendThread@879] - Socket connection established to localhost/127.0.0.1:2181, initiating session
2022-03-03 00:09:47,817 [myid:] - INFO  [main-SendThread(localhost:2181):ClientCnxn$SendThread@1303] - Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x1000009708f000e, negotiated timeout = 30000

WATCHER::

WatchedEvent state:SyncConnected type:None path:null
[zk: localhost:2181(CONNECTED) 0] ls /
[cluster, brokers, zookeeper, yarn-leader-election, hadoop-ha, admin, isr_change_notification, dolphinscheduler, log_dir_event_notification, controller_epoch, rmstore, consumers, latest_producer_id_block, config, hbase]
[zk: localhost:2181(CONNECTED) 1] rmr /hbase
[zk: localhost:2181(CONNECTED) 2] [root@zhiyong2 /]# ^C
[root@zhiyong2 /]# ^C
[root@zhiyong2 /]#

HDFS metadata deletion

Since part of HBase's data is stored in HDFS, the metadata HBase keeps in HDFS must be deleted as well. Because USDP ships with Ranger and enforces users and permissions, root cannot delete important HDFS data directly; switch to the hadoop user first.

[root@zhiyong2 /]# hadoop fs -rmr /hbase/data/hbase/meta/*
rmr: DEPRECATED: Please use '-rm -r' instead.
rmr: Failed to move to trash: hdfs://zhiyong-1/hbase/data/hbase/meta/.tabledesc: Permission denied: user=root, access=WRITE, inode="/hbase/data/hbase/meta":hadoop:supergroup:drwxr-xr-x
rmr: Failed to move to trash: hdfs://zhiyong-1/hbase/data/hbase/meta/.tmp: Permission denied: user=root, access=WRITE, inode="/hbase/data/hbase/meta":hadoop:supergroup:drwxr-xr-x
rmr: Failed to move to trash: hdfs://zhiyong-1/hbase/data/hbase/meta/1588230740: Permission denied: user=root, access=WRITE, inode="/hbase/data/hbase/meta":hadoop:supergroup:drwxr-xr-x
[root@zhiyong2 /]# cd /etc/passwd/
-bash: cd: /etc/passwd/: Not a directory
[root@zhiyong2 /]# cat /etc/passwd
root:x:0:0:root:/root:/bin/bash
bin:x:1:1:bin:/bin:/sbin/nologin
daemon:x:2:2:daemon:/sbin:/sbin/nologin
adm:x:3:4:adm:/var/adm:/sbin/nologin
lp:x:4:7:lp:/var/spool/lpd:/sbin/nologin
sync:x:5:0:sync:/sbin:/bin/sync
shutdown:x:6:0:shutdown:/sbin:/sbin/shutdown
halt:x:7:0:halt:/sbin:/sbin/halt
mail:x:8:12:mail:/var/spool/mail:/sbin/nologin
operator:x:11:0:operator:/root:/sbin/nologin
games:x:12:100:games:/usr/games:/sbin/nologin
ftp:x:14:50:FTP User:/var/ftp:/sbin/nologin
nobody:x:99:99:Nobody:/:/sbin/nologin
systemd-network:x:192:192:systemd Network Management:/:/sbin/nologin
dbus:x:81:81:System message bus:/:/sbin/nologin
polkitd:x:999:998:User for polkitd:/:/sbin/nologin
libstoragemgmt:x:998:997:daemon account for libstoragemgmt:/var/run/lsm:/sbin/nologin
abrt:x:173:173::/etc/abrt:/sbin/nologin
rpc:x:32:32:Rpcbind Daemon:/var/lib/rpcbind:/sbin/nologin
apache:x:48:48:Apache:/usr/share/httpd:/sbin/nologin
sshd:x:74:74:Privilege-separated SSH:/var/empty/sshd:/sbin/nologin
postfix:x:89:89::/var/spool/postfix:/sbin/nologin
ntp:x:38:38::/etc/ntp:/sbin/nologin
chrony:x:997:995::/var/lib/chrony:/sbin/nologin
tcpdump:x:72:72::/:/sbin/nologin
hadoop:x:1000:1000::/home/hadoop:/bin/bash
mysql:x:27:27:MySQL Server:/var/lib/mysql:/bin/false
saslauth:x:996:76:Saslauthd user:/run/saslauthd:/sbin/nologin
elastic:x:1001:1001::/home/elastic:/bin/bash
hue:x:1002:1002::/home/hue:/bin/bash
[root@zhiyong2 /]# su - hadoop
Last login: Thu Mar  3 00:24:33 CST 2022
[hadoop@zhiyong2 ~]$ hadoop fs -rmr /hbase/data/hbase/meta/*
rmr: DEPRECATED: Please use '-rm -r' instead.
2022-03-03 00:27:31 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/data/hbase/meta/.tabledesc' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/data/hbase/meta/.tabledesc
2022-03-03 00:27:31 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/data/hbase/meta/.tmp' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/data/hbase/meta/.tmp
2022-03-03 00:27:31 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/data/hbase/meta/1588230740' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/data/hbase/meta/1588230740
[hadoop@zhiyong2 ~]$ hadoop fs -rmr /hbase/data/hbase/namespace/*
rmr: DEPRECATED: Please use '-rm -r' instead.
2022-03-03 00:27:34 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/data/hbase/namespace/.tabledesc' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/data/hbase/namespace/.tabledesc
2022-03-03 00:27:34 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/data/hbase/namespace/.tmp' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/data/hbase/namespace/.tmp
2022-03-03 00:27:34 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/data/hbase/namespace/98fb8a0448305b2f9af4f9a72495b6df' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/data/hbase/namespace/98fb8a0448305b2f9af4f9a72495b6df
[hadoop@zhiyong2 ~]$ hadoop fs -rmr /hbase/MasterProcWALs/*
rmr: DEPRECATED: Please use '-rm -r' instead.
2022-03-03 00:27:38 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/MasterProcWALs/pv2-00000000000000000009.log' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/MasterProcWALs/pv2-00000000000000000009.log
2022-03-03 00:27:38 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/MasterProcWALs/pv2-00000000000000000010.log' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/MasterProcWALs/pv2-00000000000000000010.log
2022-03-03 00:27:38 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/MasterProcWALs/pv2-00000000000000000011.log' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/MasterProcWALs/pv2-00000000000000000011.log
2022-03-03 00:27:38 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/MasterProcWALs/pv2-00000000000000000012.log' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/MasterProcWALs/pv2-00000000000000000012.log
2022-03-03 00:27:38 INFO fs.TrashPolicyDefault: Moved: 'hdfs://zhiyong-1/hbase/MasterProcWALs/pv2-00000000000000000013.log' to trash at: hdfs://zhiyong-1/user/hadoop/.Trash/Current/hbase/MasterProcWALs/pv2-00000000000000000013.log
[hadoop@zhiyong2 ~]$ exit
logout
[root@zhiyong2 /]#
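
As the DEPRECATED warnings above point out, -rmr is the old form of -rm -r. For reference, the same cleanup with the non-deprecated syntax would look like the sketch below; the log lines show the files being moved into the HDFS trash, and -skipTrash would delete them immediately instead:

# same deletions with the current syntax (sketch)
hadoop fs -rm -r /hbase/data/hbase/meta/*
hadoop fs -rm -r /hbase/data/hbase/namespace/*
hadoop fs -rm -r /hbase/MasterProcWALs/*
# add -skipTrash to bypass the HDFS trash entirely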

Restart HBase
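
How the restart is done depends on the deployment: judging by the /opt/usdp-srv paths, this cluster is managed by USDP, where the HBase service should be restarted from the management console. On a plain tarball install, a minimal sketch would be (assuming $HBASE_HOME points at the HBase install):

# stop and start HBase from its own scripts (tarball install only)
$HBASE_HOME/bin/stop-hbase.sh
$HBASE_HOME/bin/start-hbase.sh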

Available after restart:

[hadoop@zhiyong2 ~]$ hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/usdp-srv/srv/udp/2.0.0.0/hdfs/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/usdp-srv/srv/udp/2.0.0.0/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.1.10, rUnknown, Mon Nov 23 09:56:35 WIB 2020
Took 0.0045 seconds
hbase(main):001:0> list_namespace
list_namespace          list_namespace_tables
hbase(main):001:0> list_namespace
NAMESPACE
default
hbase
2 row(s)
Took 0.5645 seconds
hbase(main):002:0> exit
[hadoop@zhiyong2 ~]$ exit
logout
[root@zhiyong2 /]# hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/usdp-srv/srv/udp/2.0.0.0/hdfs/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/usdp-srv/srv/udp/2.0.0.0/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.1.10, rUnknown, Mon Nov 23 09:56:35 WIB 2020
Took 0.0035 seconds
hbase(main):001:0> list_namespace
list_namespace          list_namespace_tables
hbase(main):001:0> list_namespace
NAMESPACE
default
hbase
2 row(s)
Took 0.6270 seconds
hbase(main):002:0> exit
[root@zhiyong2 /]#

At this point, both the OS root user and the hadoop user can see the namespaces again. Note that on this cluster it is the hadoop user (the account that runs HDFS), not OS root, that holds the highest permissions on the HDFS component.
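
A quick way to confirm which account actually owns the HBase directories on HDFS is to list the directory itself (a sketch; the owner column of the output is what matters):

# show owner and permissions of the HBase root directory on HDFS
hadoop fs -ls -d /hbase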

Error creating document instance. Cause: org.xml.sax.SAXParseException; lineNumber: 5; columnNumber: 5

Error creating document instance. Cause: org.xml.sax.SAXParseException; lineNumber: 5; columnNumber: 5; Invalid byte 1 of 1-byte UTF-8 sequence.

The specific error is as follows:

org.apache.ibatis.exceptions.PersistenceException: 
### Error building SqlSession.
### The error may exist in com/kuang/mapper/UserMapper.java (best guess)
### Cause: org.apache.ibatis.builder.BuilderException: Error parsing SQL Mapper Configuration. Cause: org.apache.ibatis.builder.BuilderException: Error creating document instance.  Cause: org.xml.sax.SAXParseException; lineNumber: 5; columnNumber: 5; Invalid byte 1 of 1-byte UTF-8 sequence.

	at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:30)
	at org.apache.ibatis.session.SqlSessionFactoryBuilder.build(SqlSessionFactoryBuilder.java:80)
	at org.apache.ibatis.session.SqlSessionFactoryBuilder.build(SqlSessionFactoryBuilder.java:64)
	at MyTest.test(MyTest.java:18)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: org.apache.ibatis.builder.BuilderException: Error parsing SQL Mapper Configuration. Cause: org.apache.ibatis.builder.BuilderException: Error creating document instance.  Cause: org.xml.sax.SAXParseException; lineNumber: 5; columnNumber: 5; Invalid byte 1 of 1-byte UTF-8 sequence.
	at org.apache.ibatis.builder.xml.XMLConfigBuilder.parseConfiguration(XMLConfigBuilder.java:122)
	at org.apache.ibatis.builder.xml.XMLConfigBuilder.parse(XMLConfigBuilder.java:99)
	at org.apache.ibatis.session.SqlSessionFactoryBuilder.build(SqlSessionFactoryBuilder.java:78)
	... 24 more
Caused by: org.apache.ibatis.builder.BuilderException: Error creating document instance.  Cause: org.xml.sax.SAXParseException; lineNumber: 5; columnNumber: 5; Invalid byte 1 of 1-byte UTF-8 sequence.
	at org.apache.ibatis.parsing.XPathParser.createDocument(XPathParser.java:263)
	at org.apache.ibatis.parsing.XPathParser.<init>(XPathParser.java:127)
	at org.apache.ibatis.builder.xml.XMLMapperBuilder.<init>(XMLMapperBuilder.java:81)
	at org.apache.ibatis.builder.xml.XMLMapperBuilder.<init>(XMLMapperBuilder.java:76)
	at org.apache.ibatis.builder.annotation.MapperAnnotationBuilder.loadXmlResource(MapperAnnotationBuilder.java:178)
	at org.apache.ibatis.builder.annotation.MapperAnnotationBuilder.parse(MapperAnnotationBuilder.java:118)
	at org.apache.ibatis.binding.MapperRegistry.addMapper(MapperRegistry.java:72)
	at org.apache.ibatis.session.Configuration.addMapper(Configuration.java:841)
	at org.apache.ibatis.builder.xml.XMLConfigBuilder.mapperElement(XMLConfigBuilder.java:388)
	at org.apache.ibatis.builder.xml.XMLConfigBuilder.parseConfiguration(XMLConfigBuilder.java:120)
	... 26 more
Caused by: org.xml.sax.SAXParseException; lineNumber: 5; columnNumber: 5; Invalid byte 1 of 1-byte UTF-8 sequence.
	at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:204)
	at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(ErrorHandlerWrapper.java:178)
	at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:400)
	at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:306)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:1029)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:605)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:507)
	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:867)
	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:796)
	at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:142)
	at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:247)
	at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:339)
	at org.apache.ibatis.parsing.XPathParser.createDocument(XPathParser.java:261)
	... 35 more
Caused by: com.sun.org.apache.xerces.internal.impl.io.MalformedByteSequenceException: Invalid byte 1 of 1-byte UTF-8 sequence.
	at com.sun.org.apache.xerces.internal.impl.io.UTF8Reader.invalidByte(UTF8Reader.java:702)
	at com.sun.org.apache.xerces.internal.impl.io.UTF8Reader.read(UTF8Reader.java:568)
	at com.sun.org.apache.xerces.internal.impl.XMLEntityScanner.load(XMLEntityScanner.java:1895)
	at com.sun.org.apache.xerces.internal.impl.XMLEntityScanner.scanData(XMLEntityScanner.java:1375)
	at com.sun.org.apache.xerces.internal.impl.XMLScanner.scanComment(XMLScanner.java:801)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanComment(XMLDocumentFragmentScannerImpl.java:1036)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:914)
	... 43 more

The final solution is to change the XML declaration in all MyBatis configuration and mapper files from

<?xml version="1.0" encoding="UTF-8" ?>

to

<?xml version="1.0" encoding="UTF8" ?>
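
The underlying cause of this exception is usually that the file on disk is not actually stored in the encoding its declaration claims, for example a mapper file containing Chinese comments that was saved as GBK while being declared UTF-8. Besides the declaration workaround above, re-saving the file as genuine UTF-8 also resolves it. A sketch, where the path and the GBK source encoding are assumptions to be checked first:

# inspect the real encoding, then convert the file to actual UTF-8
file -i src/main/resources/com/kuang/mapper/UserMapper.xml
iconv -f GBK -t UTF-8 src/main/resources/com/kuang/mapper/UserMapper.xml -o UserMapper.tmp.xml
mv UserMapper.tmp.xml src/main/resources/com/kuang/mapper/UserMapper.xml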

Apache Atlas Compile Error: [ERROR] Failed to execute goal on project atlas-testtools: Could not resolve

Preface

I recently compiled Apache Atlas 1.1. Quite a few errors came up during the compilation; they are sorted out and recorded here.

Error reporting information

[INFO] ------------------< org.apache.atlas:atlas-testtools >------------------
[INFO] Building Apache Atlas Test Utility Tools 1.1.0                    [3/37]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Atlas Server Build Tools 1.0 ................ SUCCESS [  0.431 s]
[INFO] apache-atlas 1.1.0 ................................. SUCCESS [  1.152 s]
[INFO] Apache Atlas Test Utility Tools 1.1.0 .............. FAILURE [  0.117 s]
[INFO] Apache Atlas Integration 1.1.0 ..................... SKIPPED
[INFO] Apache Atlas Common 1.1.0 .......................... SKIPPED
[INFO] Apache Atlas Client 1.1.0 .......................... SKIPPED
[INFO] atlas-client-common 1.1.0 .......................... SKIPPED
[INFO] atlas-client-v1 1.1.0 .............................. SKIPPED
[INFO] Apache Atlas Server API 1.1.0 ...................... SKIPPED
[INFO] Apache Atlas Notification 1.1.0 .................... SKIPPED
[INFO] atlas-client-v2 1.1.0 .............................. SKIPPED
[INFO] Apache Atlas Graph Database Projects 1.1.0 ......... SKIPPED
[INFO] Apache Atlas Graph Database API 1.1.0 .............. SKIPPED
[INFO] Graph Database Common Code 1.1.0 ................... SKIPPED
[INFO] Apache Atlas JanusGraph DB Impl 1.1.0 .............. SKIPPED
[INFO] Apache Atlas Graph Database Implementation Dependencies 1.1.0 SKIPPED
[INFO] Shaded version of Apache hbase client 1.1.0 ........ SKIPPED
[INFO] Shaded version of Apache hbase server 1.1.0 ........ SKIPPED
[INFO] Apache Atlas Authorization 1.1.0 ................... SKIPPED
[INFO] Apache Atlas Repository 1.1.0 ...................... SKIPPED
[INFO] Apache Atlas UI 1.1.0 .............................. SKIPPED
[INFO] Apache Atlas Web Application 1.1.0 ................. SKIPPED
[INFO] Apache Atlas Documentation 1.1.0 ................... SKIPPED
[INFO] Apache Atlas FileSystem Model 1.1.0 ................ SKIPPED
[INFO] Apache Atlas Plugin Classloader 1.1.0 .............. SKIPPED
[INFO] Apache Atlas Hive Bridge Shim 1.1.0 ................ SKIPPED
[INFO] Apache Atlas Hive Bridge 1.1.0 ..................... SKIPPED
[INFO] Apache Atlas Falcon Bridge Shim 1.1.0 .............. SKIPPED
[INFO] Apache Atlas Falcon Bridge 1.1.0 ................... SKIPPED
[INFO] Apache Atlas Sqoop Bridge Shim 1.1.0 ............... SKIPPED
[INFO] Apache Atlas Sqoop Bridge 1.1.0 .................... SKIPPED
[INFO] Apache Atlas Storm Bridge Shim 1.1.0 ............... SKIPPED
[INFO] Apache Atlas Storm Bridge 1.1.0 .................... SKIPPED
[INFO] Apache Atlas Hbase Bridge Shim 1.1.0 ............... SKIPPED
[INFO] Apache Atlas Hbase Bridge 1.1.0 .................... SKIPPED
[INFO] Apache Atlas Kafka Bridge 1.1.0 .................... SKIPPED
[INFO] Apache Atlas Distribution 1.1.0 .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  2.323 s
[INFO] Finished at: 2022-02-17T15:24:45+05:30
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project atlas-testtools: Could not resolve dependencies for project org.apache.atlas:atlas-testtools:jar:1.1.0: Failed to collect dependencies at org.apache.solr:solr-test-framework:jar:7.0.0 -> org.restlet.jee:org.restlet:jar:2.3.0: Failed to read artifact descriptor for org.restlet.jee:org.restlet:jar:2.3.0: Could not transfer artifact org.restlet.jee:org.restlet:pom:2.3.0 from/to maven-default-http-blocker (http://0.0.0.0/): Blocked mirror for repositories: [maven-restlet (http://maven.restlet.org, default, releases+snapshots), central (http://repo1.maven.org/maven2, default, releases), hortonworks.repo (http://repo.hortonworks.com/content/repositories/releases, default, releases), typesafe (http://repo.typesafe.com/typesafe/releases/, default, releases+snapshots), apache.snapshots (http://repository.apache.org/snapshots, default, snapshots)] -> [Help 1]

Cause of problem:

The org.restlet.jee artifacts are not published in Maven Central; they live only in the dedicated Restlet repository. In addition, Maven 3.8+ blocks plain http:// repository URLs by default (the maven-default-http-blocker mirror in the error message), so the artifact cannot be downloaded from the repositories listed in the POM.

Solution:

Open the pom.xml of the failing module: vim test-tools/pom.xml

Add the following inside the <project> element of the POM, before the closing </project> tag:

<repositories>
  <repository>
    <id>maven-restlet</id>
    <name>Restlet repository</name>
    <url>https://maven.restlet.talend.com</url>
  </repository>
</repositories>
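
After adding the repository, the fix can be checked without rerunning the whole reactor by fetching just the blocked artifact; a sketch using the standard maven-dependency-plugin:

# pull the missing artifact directly through the new repository
mvn dependency:get -Dartifact=org.restlet.jee:org.restlet:2.3.0 -DremoteRepositories=maven-restlet::default::https://maven.restlet.talend.com

The overall build can then be resumed from the failing module with mvn ... -rf :atlas-testtools.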

Kafka create-topic script error: org.apache.kafka.common.errors.InvalidReplicationFactorException: Replication factor larger than available brokers

Problem:

To test the integration of Spark Streaming and Kafka, two topics have to be created in Kafka in advance, but running the creation script reports the following error:

 kafka-topics.sh --zookeeper linux1:2181,linux2:2181,linux3:2181 --create --topic wufabao_topic01 --replication-factor 2 --partitions 3

WARNING: Due to limitations in metric names, topics with a period ('.') or underscore ('_') could collide. To avoid issues it is best to use either, but not both.
Error while executing topic command : Replication factor: 2 larger than available brokers: 0.
[2022-02-09 17:27:18,432] ERROR org.apache.kafka.common.errors.InvalidReplicationFactorException: Replication factor: 2 larger than available brokers: 0.
 (kafka.admin.TopicCommand$)

Reason:

The ZooKeeper path in the command does not match the metadata path configured for Kafka. The metadata path I configured in server.properties is stored under a chroot:
zookeeper.connect=linux1:2181,linux2:2181,linux3:2181/myKafka

Solution:

Modify the ZooKeeper path in the script to include the /myKafka chroot:
kafka-topics.sh --zookeeper linux1:2181,linux2:2181,linux3:2181/myKafka --create --topic wufabao_topic01 --replication-factor 2 --partitions 3
The topic is created successfully.
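
With the chroot in place, the topic can be verified, and the second topic the test needs can be created the same way. A sketch, where wufabao_topic02 is a hypothetical name; substitute the one your code expects:

# verify the first topic, then create the second one
kafka-topics.sh --zookeeper linux1:2181,linux2:2181,linux3:2181/myKafka --describe --topic wufabao_topic01
kafka-topics.sh --zookeeper linux1:2181,linux2:2181,linux3:2181/myKafka --create --topic wufabao_topic02 --replication-factor 2 --partitions 3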