Category Archives: Error

Springboot connects to the database error: testWhileIdle is true, validationQuery not set

Problem Description:

  Using Spring Boot to connect to the database, an error is reported at startup: testWhileIdle is true, validationQuery not set. It does not affect use of the system; all database access works normally.

  The application.properties data source configuration is as follows:

  spring.datasource.username=root
  spring.datasource.password=
  spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
  spring.datasource.initial-size=1
  spring.datasource.maximum-pool-size=10
  spring.datasource.connection-timeout=5000

 

Solution:

Spring Boot 1.4 dropped spring.datasource.type

  In the new version, the validationQuery is no longer injected automatically, so you need to declare the data-source bean manually.

  The fix is mainly to inject DruidDataSource by hand: declare a configuration class, bind all of the data-source properties into it, and return a DruidDataSource.

  Restart, and the system no longer reports the error!

  Finally, I checked the DruidDataSource source code:

  public static final String DEFAULT_VALIDATION_QUERY = null;

  Sure enough, DruidDataSource defaults to testWhileIdle=true while validationQuery defaults to null.

Kafka error: ERROR There was an error in one of the threads during logs loading: java.lang.NumberFormatException: For input string: "derby" (kafka.log.LogManager)

1. Notes on a Kafka error

 

After Kafka was stopped, the following error occurred on restart:

[2017-10-27 09:43:18,313] INFO Recovering unflushed segment 15000679 in log mytest-0. (kafka.log.Log)
[2017-10-27 09:43:18,972] ERROR There was an error in one of the threads during logs loading: java.lang.NumberFormatException: For input string: "derby" (kafka.log.LogManager)
[2017-10-27 09:43:18,975] FATAL [Kafka Server 0], Fatal error during KafkaServer startup. Prepare to shutdown (kafka.server.KafkaServer)
java.lang.NumberFormatException: For input string: "derby"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Long.parseLong(Long.java:589)
        at java.lang.Long.parseLong(Long.java:631)
        at scala.collection.immutable.StringLike$class.toLong(StringLike.scala:277)
        at scala.collection.immutable.StringOps.toLong(StringOps.scala:29)
        at kafka.log.Log$.offsetFromFilename(Log.scala:1648)
        at kafka.log.Log$$anonfun$loadSegmentFiles$3.apply(Log.scala:284)
        at kafka.log.Log$$anonfun$loadSegmentFiles$3.apply(Log.scala:272)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)

 

Looking directly at the error log, the obvious error is:

ERROR There was an error in one of the threads during logs loading: java.lang.NumberFormatException: For input string: "derby" (kafka.log.LogManager)

Reading it literally: a thread hit an error while loading the logs, and java.lang.NumberFormatException was thrown because the input string "derby" could not be parsed as a number.

What on earth is that about??

First, let’s review what Kafka does on restart: when a broker starts, it reloads the data of every topic it held before, and under normal circumstances it logs that each topic has been recovered:

INFO Recovering unflushed segment 8790240 in log userlog-2. (kafka.log.Log)
INFO Loading producer state from snapshot file 00000000000008790240.snapshot for partition userlog-2 (kafka.log.ProducerStateManager)
INFO Loading producer state from offset 10464422 for partition userlog-2 with message format version 2 (kafka.log.Log)
INFO Loading producer state from snapshot file 00000000000010464422.snapshot for partition userlog-2 (kafka.log.ProducerStateManager)
INFO Completed load of log userlog-2 with 2 log segments, log start offset 6223445 and log end offset 10464422 in 4460 ms (kafka.log.Log)

 

But when data recovery fails for some topic, the broker shuts down with:

ERROR There was an error in one of the threads during logs loading: java.lang.NumberFormatException: For input string: "derby" (kafka.log.LogManager)

Now it is clear that the problem lies in the topic data. But what exactly is wrong??

Go straight to where Kafka stores its topic data; this path is set in server.properties:

log.dirs=/data/kafka/kafka-logs

1) From the line just before the error in the log, it can be seen that the problem occurred while loading the topic mytest-0. Go to that topic's directory: there is an illegal file named derby.log. Delete it directly and restart the service.

2) Check the whole log directory to make sure there are no similar files:

# cd /data/kafka/kafka-logs
# find /data/kafka/kafka-logs/ -name "derby*"

You can see that there is a derby.log file under the topic directory mytest-0, which is illegal: Kafka requires every data file name in a log directory to parse as a Long offset. Just delete this file and restart Kafka.
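The requirement above can be checked mechanically: a data file whose base name does not parse as a numeric offset is suspect. A rough Python sketch of such a scan (the paths are examples; note that newer Kafka versions also keep legitimate bookkeeping files such as recovery checkpoints in the log dirs, so treat hits as candidates to inspect, not to delete blindly):

```python
import pathlib

def illegal_kafka_files(log_dir):
    """Return files under a Kafka log dir whose base name is not a
    numeric offset -- these are the ones that break log loading."""
    bad = []
    for f in pathlib.Path(log_dir).rglob("*"):
        if f.is_file() and not f.stem.isdigit():
            bad.append(str(f))
    return sorted(bad)

# e.g. illegal_kafka_files("/data/kafka/kafka-logs/mytest-0")
# would flag "derby.log" but not "00000000000015000679.log"
```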

 

2. Notes on a Kafka/ZooKeeper error

Both Kafka and ZooKeeper started normally, but the logs show clients being disconnected soon after connecting. The error message is as follows:

[2017-10-27 15:06:08,981] INFO Established session 0x15f5c88c014000a with negotiated timeout 240000 for client /127.0.0.1:33494 (org.apache.zookeeper.server.ZooKeeperServer)
[2017-10-27 15:06:08,982] INFO Processed session termination for sessionid: 0x15f5c88c014000a (org.apache.zookeeper.server.PrepRequestProcessor)
[2017-10-27 15:06:08,984] WARN caught end of stream exception (org.apache.zookeeper.server.NIOServerCnxn)
EndOfStreamException: Unable to read additional data from client sessionid 0x15f5c88c014000a, likely client has closed socket
        at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
        at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
        at java.lang.Thread.run(Thread.java:745)

 

Reading the log literally: the first entry says session 0x15f5c88c014000a was established with a negotiated timeout of 240000 ms (240 seconds); the second says that session was then terminated; the third says no more data could be read from it, most likely because the client closed the socket (it is disconnected, so of course nothing can be read). So the analysis points to the session being dropped by a timeout, and the fix is to increase the session timeout.

The configured timeout is too short: ZooKeeper has not finished reading the consumer's data before the consumer disconnects!

 

Solution:

Modify kafka’s server.properties file:

# Timeout in ms for connecting to zookeeper

zookeeper.connection.timeout.ms=600000

zookeeper.session.timeout.ms=400000

 

Generally this is enough. If you are still not at ease, change the ZooKeeper configuration file as well:

# disable the per-ip limit on the number of connections since this is a non-production config

maxClientCnxns=1000

tickTime=120000
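A detail worth knowing: the ZooKeeper server clamps each client's requested session timeout to the range [2 * tickTime, 20 * tickTime] by default (configurable via minSessionTimeout/maxSessionTimeout), which is why the "negotiated" value in the log can differ from what the client asked for. An illustrative sketch of the rule (not ZooKeeper code):

```python
def negotiated_session_timeout(requested_ms, tick_time_ms,
                               min_ms=None, max_ms=None):
    """Clamp a client's requested session timeout the way the ZooKeeper
    server does by default: bounds of 2*tickTime and 20*tickTime."""
    lo = min_ms if min_ms is not None else 2 * tick_time_ms
    hi = max_ms if max_ms is not None else 20 * tick_time_ms
    return max(lo, min(requested_ms, hi))

# With tickTime=120000, a client asking for 400000 ms actually gets
# 400000 ms, since the bounds are 240000..2400000.
print(negotiated_session_timeout(400000, 120000))
```

This is why raising only the client-side timeout may not help: if the server's tickTime is small, the request is silently clamped down to 20 * tickTime.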

 

How to Solve Maven Error: Return code is: 501, ReasonPhrase: HTTPS Required

When building with Jenkins today, the following error was reported:

  [ERROR] Failed to execute goal on project saas20: Could not resolve dependencies for project com.ipower365.saas:saas20:war:0.0.1-SNAPSHOT: Failed to collect dependencies at com.ipower365.saas:messageserviceimpl:jar:0.0.1-SNAPSHOT -> com.ipower365.boss:nacha:jar:1.0.1: Failed to read artifact descriptor for com.ipower365.boss:nacha:jar:1.0.1: Could not transfer artifact com.ipower365.boss:nacha:pom:1.0.1 from/to central (http://repo1.maven.org/maven2/):Failed to transfer file:http://repo1.maven.org/maven2/com/ipower365/boss/nacha/1.0.1/nacha-1.0.1.pom. Return code is: 501 , ReasonPhrase:HTTPS Required. -> [Help 1]

We found that this dependency exists in the local repository, yet during the build, after going through the local Nexus, Maven still requests the file from the central repository:

    [echoing saas20] Downloading from central: http://repo1.maven.org/maven2/com/ipower365/boss/nacha/1.0.1/nacha-1.0.1.pom

After that, we looked into the returned 501 error. The reference link is as follows:

https://stackoverflow.com/questions/59763531/maven-dependencies-are-failing-with-501-error

 

 

As mentioned there, since January 15, 2020 the central repository no longer supports insecure communication over plain HTTP and requires all requests to the repository to be made over HTTPS.

So we added the following mirror to the settings.xml used during the build:

<mirror>
  <id>central</id>
  <name>Maven Repository Switchboard</name>
  <url>https://repo1.maven.org/maven2/</url>
  <mirrorOf>central</mirrorOf>
</mirror>

But the problem was still not resolved; the next error was:

    Could not transfer artifact com.ipower365.boss:nacha:pom:1.0.1 from/to central (https://repo1.maven.org/maven2/):Received fatal alert: protocol_version -> [Help 1]

This happens when requesting the central repository over HTTPS: the TLS protocol version has to be specified explicitly. The following parameter was added to the build; the reference link is as follows:

https://stackoverflow.com/questions/50824789/why-am-i-getting-received-fatal-alert-protocol-version-or-peer-not-authentic

-Dhttps.protocols=TLSv1.2

Building again, the requests went through!

Reason: our Java environments use both 7 and 8, while our Maven version is 3.5.x.

When packaging with Maven under Java 8, the parameter above is not needed (Java 8 negotiates TLS 1.2 by default), but under Java 7, which defaults to older TLS versions, the error above occurs. Later we will consider upgrading Maven and unifying the Java environment.

[Solved] Virtualenvwrapper.sh error: There was a problem running the initialization hooks.

In an Ubuntu environment, I made python a soft link to python3.6 (ln -s python3.6 python), and did the same for pip. So after installing virtualenvwrapper with pip, sourcing virtualenvwrapper.sh and running workon on a virtual environment always reported an error:

./virtualenvwrapper.sh: line 230: : command not found
virtualenvwrapper.sh: There was a problem running the initialization hooks.

If Python could not import the module virtualenvwrapper.hook_loader,
check that virtualenvwrapper has been installed for
VIRTUALENVWRAPPER_PYTHON= and that PATH is
set properly.

According to the hint, this is the statement on line 230:

"$VIRTUALENVWRAPPER_PYTHON" -m 'virtualenvwrapper.hook_loader' \

Combining the error message with that line, the guess is that VIRTUALENVWRAPPER_PYTHON is the problem. Searching for VIRTUALENVWRAPPER_PYTHON in virtualenvwrapper.sh turns up the key part:

# Locate the global Python where virtualenvwrapper is installed.
if [ "${VIRTUALENVWRAPPER_PYTHON:-}" = "" ]
then
    VIRTUALENVWRAPPER_PYTHON="$(command \which python3)"  # originally \which python; shown here after my change to python3
fi

VIRTUALENVWRAPPER_PYTHON is used to "locate the global Python where virtualenvwrapper is installed". It originally points at python, which here is version 2.7; since I installed virtualenvwrapper with python3.6, it has to be changed to python3. After that, the error disappeared.
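What virtualenvwrapper.sh is doing here can be imitated in a few lines: probe candidate interpreters until one can import the needed module. This is a hypothetical helper for illustration, not part of the tool:

```python
import subprocess
import sys

def find_python_with_module(candidates, module):
    """Return the first interpreter in `candidates` that can import
    `module` (mimicking how virtualenvwrapper.sh resolves
    VIRTUALENVWRAPPER_PYTHON); None if no candidate works."""
    for exe in candidates:
        try:
            ok = subprocess.run([exe, "-c", f"import {module}"],
                                capture_output=True).returncode == 0
        except OSError:  # interpreter does not exist at all
            ok = False
        if ok:
            return exe
    return None

# The current interpreter can always import its own stdlib:
print(find_python_with_module([sys.executable], "os"))
```

Probing "python3" before "python" this way is exactly why changing the script to \which python3 fixes the hook-loader error.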


The advantage of using virtualenvwrapper is that you no longer need to run source /xxx/<virtual-env>/bin/activate to start a virtual environment each time. Configure it once in ~/.bashrc and from then on you can open environments directly with the workon command. The specific steps (the prerequisite is that you have installed python-virtualenv):

# Setup:
#  1. Create a directory to hold the virtual environments.
#     (mkdir $HOME/.virtualenvs).
#  2. Add a line like "export WORKON_HOME=$HOME/.virtualenvs"
#     to your .bashrc.
#  3. Add a line like "source /path/to/this/file/virtualenvwrapper.sh"
#     to your .bashrc.
#  4. Run: source ~/.bashrc
#  5. Run: workon
#  6. A list of environments, empty, is printed.
#  7. Run: mkvirtualenv temp
#  8. Run: workon
#  9. This time, the "temp" environment is included.
# 10. Run: workon temp
# 11. The virtual environment is activated.

Ionic 3 update: opening the APK on Android 8.0 reports FileUriExposedException

The project forces an update on Android; after the file is downloaded, the APK package cannot be opened on Android 8.0.

The FileOpener plugin is used as follows, and it reports the error:

import { FileOpener } from '@ionic-native/file-opener';

constructor(private fileOpener: FileOpener) { }

...

this.fileOpener.open('path/to/file.pdf', 'application/pdf')
  .then(() => console.log('File is opened'))
  .catch(e => console.log('Error opening file', e));

 

FileUriExposedException: file:///storage/emulated/0/test.txt exposed beyond app through Intent.getData()


According to Cordova’s official documentation, Android 8.0 needs an extra configuration entry.

Android APK installation restrictions
When opening an APK file for installation, the following restrictions apply:

On Android 8+, your application must have the REQUEST_INSTALL_PACKAGES permission. You can add it to your application's config.xml file:

    <platform name="android">
        <config-file parent="/manifest" target="AndroidManifest.xml" xmlns:android="http://schemas.android.com/apk/res/android">
            <uses-permission android:name="android.permission.REQUEST_INSTALL_PACKAGES" />
        </config-file>
    </platform>

Before Android 7, you could only install APKs from the "external" partition: for example, you could install from cordova.file.externalDataDirectory but not from cordova.file.dataDirectory. Android 7+ does not have this limitation.

After adding the above content, it still reported an error.

 

Solution:

The above is the answer found online. The final change is in AndroidManifest.xml:

<uses-sdk android:minSdkVersion="16" android:targetSdkVersion="23" />

Ionic 3's default configuration is SDK 16–26. After many trials, only SDK versions 16–23 work directly; with a higher targetSdkVersion, the error above is reported.

Package again and the problem is solved!

Log jar package conflict error: Class path contains multiple SLF4J bindings

Problem description: Tomcat hangs at startup, and the error log is as follows:

November 07, 2017 8:35:45 PM org.apache.catalina.core.ApplicationContext log
Information: Initializing Spring root WebApplicationContext
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/E:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/wlf-notify/WEB-INF/lib/log4j-slf4j-impl-2.9.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/E:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/wlf-notify/WEB-INF/lib/slf4j-log4j12-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

  Problem location: the log shows two conflicting logging jars: log4j-slf4j-impl-2.9.1.jar and slf4j-log4j12-1.7.2.jar.

  Solve the problem:

  1. Simple solution: following the local Tomcat path given in the log, enter E:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/wlf-notify/WEB-INF/lib, delete log4j-slf4j-impl-2.9.1.jar, and restart.

  2. Fundamental solution: analyze Maven's dependency tree (mvn dependency:tree), then eliminate the conflicting dependency with <exclusions> in pom.xml.
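To see exactly which jars bring an SLF4J binding onto the classpath, you can scan them for the org/slf4j/impl/StaticLoggerBinder.class entry that SLF4J itself looks up. A small sketch (the lib directory path is an example):

```python
import pathlib
import zipfile

def jars_with_slf4j_binding(lib_dir):
    """Return the jar files in lib_dir that contain an SLF4J binding,
    i.e. an org/slf4j/impl/StaticLoggerBinder.class entry."""
    hits = []
    for jar in sorted(pathlib.Path(lib_dir).glob("*.jar")):
        with zipfile.ZipFile(jar) as zf:  # a jar is just a zip archive
            if "org/slf4j/impl/StaticLoggerBinder.class" in zf.namelist():
                hits.append(jar.name)
    return hits

# e.g. jars_with_slf4j_binding("WEB-INF/lib") -- on the setup above this
# would list both log4j-slf4j-impl-2.9.1.jar and slf4j-log4j12-1.7.2.jar
```

If the scan reports more than one jar, keep exactly one binding and exclude the others in pom.xml.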

How to Solve Import antd Error: Module build failed

1. Create a project using the create-react-app tool

create-react-app antd-demo

2. Install babel-plugin-import

npm install babel-plugin-import --dev

3. Import antd on demand

Import it in App.js:

import { Button } from 'antd';

Configure babel in package.json:

"babel": {
  "plugins": [
    [
      "import",
      {
        "libraryName": "antd",
        "style": true
      }
    ]
  ]
},

Finally, an error was reported when the project started; the error message is as follows:

./node_modules/antd/lib/button/style/index.less
Module build failed:
// https://github.com/ant-design/ant-motion/issues/44
.bezierEasingMixin();
^
Inline JavaScript is not enabled. Is it set in your options?
      in E:\webstrom\migu\ngoc\web\react-interface\react-interface-cli\node_modules\antd\lib\style\color\bezierEasing.less (line 110, column 0)

Finally, setting "style": "css" fixed it.

The style option here can be true or 'css', but I don't know why true produces an error in my case.

In the babel-plugin-import configuration, options can be an array:

{ "plugins": [["import", options]] }

Import the js module:

["import", { "libraryName": "antd" }]

Import js and css modules (LESS/Sass source files):

["import", { "libraryName": "antd", "style": true }]

Import js and css modules (css pre-built files):

["import", { "libraryName": "antd", "style": "css" }]

[Solved] Springboot Startup Error: Consider defining a bean of type ‘XXX’ in your configuration.

1. Introduction:

The project was built using Spring Boot automatic injection, but an error was reported at startup!!! The error is as follows:

Description: Field userEntityMapper in com.xxx.xxx.service.UserService required a bean of
type 'com.xxx.xxx.dao.UserEntityMapper' that could not be found.

Action: Consider defining a bean of
type 'com.xxx.xxx.dao.UserEntityMapper' in your configuration.

2. Solution:

1. Check whether the annotations are written wrongly; they are not.

2. Solutions found online, as follows:

Step one:

  Add the MyBatis configuration to the Spring Boot configuration file, as follows:

mybatis: 
  typeAliasesPackage: com.xxx.xxx.dao.entity 
  mapperLocations: classpath:mapper/*.xml

 Step two:

  ① Put the interface and its implementation class in the same package as the application startup class, or in a sub-package of it, so that the annotations can be scanned; this is the most trouble-free way. (Not tested)
  ② Or add the @MapperScan or @ComponentScan annotation to the startup class and manually specify which packages to scan, as shown below:

@SpringBootApplication
 @ComponentScan(basePackages = {"com.xxx.xxx.dao"})

③ Or add the @Mapper annotation on the interface:

@Mapper
public interface UserMapper {
}

PS: the reason the corresponding bean was not found is that @SpringBootApplication did not scan it.

SpringBoot uses MyBatis error: Error invoking SqlProvider method (tk.mybatis.mapper.provider.base.BaseInsertProvider.dynamicSQL)

1. Error description

  Spring Boot is integrated with the MyBatis framework, using mapper-spring-boot-starter for the generic Mapper support and mybatis-generator-core to generate the MyBatis mapping files.

  SpringBoot version: 2.0.0.RELEASE

  mybatis-spring-boot-starter version: 1.3.2

  mapper-spring-boot-starter version: 1.2.4

  mybatis-generator-core version: 1.3.6

2. Error message

org.mybatis.spring.MyBatisSystemException: nested exception is org.apache.ibatis.builder.BuilderException: Error invoking SqlProvider method (tk.mybatis.mapper.provider.base.BaseInsertProvider.dynamicSQL). Cause: java.lang.InstantiationException: tk.mybatis.mapper.provider.base.BaseInsertProvider
    at org.mybatis.spring.MyBatisExceptionTranslator.translateExceptionIfPossible(MyBatisExceptionTranslator.java: 77 )
    at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java: 446 )
    at com.sun.proxy.$Proxy79.insert(Unknown Source)
    at org.mybatis.spring.SqlSessionTemplate.insert(SqlSessionTemplate.java: 278 )
    at org.apache.ibatis.binding.MapperMethod.execute(MapperMethod.java: 58 )
    at org.apache.ibatis.binding.MapperProxy.invoke(MapperProxy.java: 59 )
    at com.sun.proxy.$Proxy80.insert(Unknown Source)
    at com.imooc.sevice.impl.SysUserServiceImpl.saveUser(SysUserServiceImpl.java: 19 )
    at com.imooc.controller.SysUserController.saveUser(SysUserController.java: 35 )
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java: 62 )
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java: 43 )
    at java.lang.reflect.Method.invoke(Method.java: 498 )
    at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java: 209 )
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java: 136 )
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java: 102 )
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java: 870 )
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java: 776 )
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java: 87 )
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java: 991 )
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java: 925 )
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java: 978 )
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java: 881 )
    at javax.servlet.http.HttpServlet.service(HttpServlet.java: 661 )
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java: 855 )
    at javax.servlet.http.HttpServlet.service(HttpServlet.java: 742 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 231 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java: 52 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 193 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at com.alibaba.druid.support.http.WebStatFilter.doFilter(WebStatFilter.java: 123 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 193 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java: 99 )
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java: 107 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 193 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java: 109 )
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java: 107 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 193 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java: 81 )
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java: 107 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 193 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java: 200 )
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java: 107 )
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java: 193 )
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java: 166 )
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java: 199 )
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java: 96 )
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java: 496 )
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java: 140 )
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java: 81 )
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java: 87 )
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java: 342 )
    at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java: 803 )
    at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java: 66 )
    at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java: 790 )
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java: 1459 )
    at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java: 49 )
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java: 1149 )
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java: 624 )
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java: 61 )
    at java.lang.Thread.run(Thread.java: 748 )
Caused by: org.apache.ibatis.builder.BuilderException: Error invoking SqlProvider method (tk.mybatis.mapper.provider.base.BaseInsertProvider.dynamicSQL). Cause: java.lang.InstantiationException: tk.mybatis.mapper.provider.base.BaseInsertProvider
    at org.apache.ibatis.builder.annotation.ProviderSqlSource.createSqlSource(ProviderSqlSource.java: 134 )
    at org.apache.ibatis.builder.annotation.ProviderSqlSource.getBoundSql(ProviderSqlSource.java: 102 )
    at org.apache.ibatis.mapping.MappedStatement.getBoundSql(MappedStatement.java: 292 )
    at org.apache.ibatis.executor.statement.BaseStatementHandler. <init>(BaseStatementHandler.java:64 )
    at org.apache.ibatis.executor.statement.PreparedStatementHandler. <init>(PreparedStatementHandler.java:40 )
    at org.apache.ibatis.executor.statement.RoutingStatementHandler. <init>(RoutingStatementHandler.java:46 )
    at org.apache.ibatis.session.Configuration.newStatementHandler(Configuration.java: 558 )
    at org.apache.ibatis.executor.SimpleExecutor.doUpdate(SimpleExecutor.java: 48 )
    at org.apache.ibatis.executor.BaseExecutor.update(BaseExecutor.java: 117 )
    at org.apache.ibatis.executor.CachingExecutor.update(CachingExecutor.java: 76 )
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java: 62 )
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java: 43 )
    at java.lang.reflect.Method.invoke(Method.java: 498 )
    at org.apache.ibatis.plugin.Plugin.invoke(Plugin.java: 63 )
    at com.sun.proxy.$Proxy95.update(Unknown Source)
    at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java: 198 )
    at org.apache.ibatis.session.defaults.DefaultSqlSession.insert(DefaultSqlSession.java: 185 )
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java: 62 )
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java: 43 )
    at java.lang.reflect.Method.invoke(Method.java: 498 )
    at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java: 433 )
    ... 64 more
Caused by: java.lang.InstantiationException: tk.mybatis.mapper.provider.base.BaseInsertProvider
    at java.lang.Class.newInstance(Class.java: 427 )
    at org.apache.ibatis.builder.annotation.ProviderSqlSource.invokeProviderMethod(ProviderSqlSource.java: 165 )
    at org.apache.ibatis.builder.annotation.ProviderSqlSource.createSqlSource(ProviderSqlSource.java: 116 )
    ... 86 more
Caused by: java.lang.NoSuchMethodException: tk.mybatis.mapper.provider.base.BaseInsertProvider.<init>()
    at java.lang.Class.getConstructor0(Class.java: 3082 )
    at java.lang.Class.newInstance(Class.java: 412 )
    ... 88 more

3. Cause

  On the Spring Boot startup class, the @MapperScan annotation was imported from the wrong package.

  The correct import is: import tk.mybatis.spring.annotation.MapperScan;

  The wrong import was: import org.mybatis.spring.annotation.MapperScan;

How to Solve Springmvc Error: org.springframework.web.servlet.DispatcherServlet

When writing a Spring MVC application, after importing all the required packages and running the program, the console reports the following error:

Critical: Servlet [springDispatcherServlet] in web application [/SpringMVC-1] threw load() exception
java.lang.ClassNotFoundException: org.springframework.web.servlet.DispatcherServlet
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java: 1333 )
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java: 1167 )
at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java: 518 )
at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java: 499 )
at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java: 118 )
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java: 1091 )
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java: 1027 )
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java: 5038 )
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java: 5348 )
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java: 145 )
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java: 725 )
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java: 701 )
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java: 717 )
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java: 587 )
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java: 1798 )
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)

It is not that org.springframework.web.servlet.DispatcherServlet cannot be imported in a Java file; the class lives in the spring-webmvc jar, and all the imported jar packages are as follows:

They have all been imported, so that is not the problem. In fact, all the dependencies must also be added to the server's deployment classpath. Steps:

Right-click the project–>properties–>Deployment Assembly–>add–>Java Build Path Entries–>import all dependent Jar packages and restart tomcat.

Mysql5.7.18.1 Error 1054 (42S22): Unknown Column’password’ In’field List’ When Changing User Password

The intention here was to change a user's password; the command found online was as follows:

mysql> update user set password=password("new password") where user="username";

After execution, it reports: ERROR 1054 (42S22): Unknown column 'password' in 'field list'.

The reason for the error is that from MySQL 5.7 onward, the user table in the mysql database no longer has a password column; the password hash is stored in the authentication_string column instead.

So use the following commands instead:

>mysql -u root -p
Enter password: ********
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 12
Server version: 5.7.18-log MySQL Community Server (GPL)

Copyright (c) 2000, 2017, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> use mysql;
Database changed
mysql> select User from user;  # query the existing users
+-----------+
| User      |
+-----------+
| *******   |
| mysql.sys |
| root      |
+-----------+
3 rows in set (0.00 sec)

mysql> update user set password=password("*******") where user="*******";  # changing the password fails
ERROR 1054 (42S22): Unknown column 'password' in 'field list'
mysql> update mysql.user set authentication_string=password('*******') where user='*******';  # password changed successfully
Query OK, 1 row affected, 1 warning (0.00 sec)
Rows matched: 1  Changed: 1  Warnings: 1

mysql> flush privileges;  # take effect immediately
Query OK, 0 rows affected (0.00 sec)

mysql> quit
Bye

>mysql -u ******* -p  # logging in as this user now succeeds
Enter password: ********
…………………………
mysql>
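As a side note (not from the original post, but documented MySQL behavior): from MySQL 5.7.6 on, ALTER USER is the supported way to change a password, avoiding direct updates to mysql.user. A minimal sketch, with placeholder account name and password, that builds the statement before handing it to the client:

```shell
# Placeholders -- substitute the real account and password.
DB_USER='someuser'
NEW_PASS='new_password'

# Build the ALTER USER statement (supported on MySQL 5.7.6+).
STMT="ALTER USER '${DB_USER}'@'localhost' IDENTIFIED BY '${NEW_PASS}';"
echo "$STMT"

# To apply it for real, pass it to the client, e.g.:
#   mysql -u root -p -e "$STMT"
```

Unlike the UPDATE against mysql.user, ALTER USER also refreshes the privilege cache, so a separate FLUSH PRIVILEGES is not needed.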

Hadoop Error: hdfs.DFSClient: Exception in createBlockOutputStream

java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1492)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1155)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:40 INFO hdfs.DFSClient: Abandoning BP-1909118226-192.168.19.234-1427110524238:blk_1073762363_21550
15/03/24 18:26:40 INFO hdfs.DFSClient: Excluding datanode 192.168.21.24:50010
copy from: /root/zenggq/jn2/data2w/t0.head_2000 to /recom1000/t0.head_2000
15/03/24 18:26:41 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1492)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1155)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:41 INFO hdfs.DFSClient: Abandoning BP-1909118226-192.168.19.234-1427110524238:blk_1073762365_21552
15/03/24 18:26:41 INFO hdfs.DFSClient: Excluding datanode 192.168.21.23:50010
15/03/24 18:26:41 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.21.24:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1166)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:41 INFO hdfs.DFSClient: Abandoning BP-1909118226-192.168.19.234-1427110524238:blk_1073762366_21553
15/03/24 18:26:41 INFO hdfs.DFSClient: Excluding datanode 192.168.21.24:50010
15/03/24 18:26:41 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1492)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1155)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:41 INFO hdfs.DFSClient: Abandoning BP-1909118226-192.168.19.234-1427110524238:blk_1073762367_21554
15/03/24 18:26:41 INFO hdfs.DFSClient: Excluding datanode 192.168.19.236:50010
15/03/24 18:26:41 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1492)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1155)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:41 INFO hdfs.DFSClient: Abandoning BP-1909118226-192.168.19.234-1427110524238:blk_1073762368_21555
15/03/24 18:26:41 INFO hdfs.DFSClient: Excluding datanode 192.168.21.30:50010
15/03/24 18:26:41 WARN hdfs.DFSClient: DataStreamer Exception
java.io.IOException: Unable to create new block.
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1100)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:41 WARN hdfs.DFSClient: Could not get block locations. Source file "/recom1000/t1.head_2000" - Aborting...
Exception in thread "main" java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1492)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1155)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
15/03/24 18:26:41 ERROR hdfs.DFSClient: Failed to close file /recom1000/t1.head_2000
java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1492)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1155)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1088)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)
[root@master jn2]#

I searched around for the answer. One suggestion was that the datanode process was not running; another was that the firewall had not been turned off. Neither was the case for me.

In the end, I deleted the data directory under hadoop-dir and then reformatted the namenode:

hadoop namenode -format

After that, everything worked.
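The recovery steps above can be sketched as a script. This is a hedged sketch, not a drop-in fix: it destroys all HDFS data, the stop-dfs.sh/start-dfs.sh scripts and the data-directory path are assumptions about your installation, and a DRY_RUN guard (on by default) only prints the commands instead of running them:

```shell
# DRY_RUN=1 (default) only prints each command; set DRY_RUN=0 to execute.
DRY_RUN=${DRY_RUN:-1}
# Assumed datanode storage dir -- match dfs.datanode.data.dir in hdfs-site.xml.
HADOOP_DATA_DIR=${HADOOP_DATA_DIR:-/opt/hadoop-dir/data}

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

run stop-dfs.sh                 # stop HDFS before touching storage
run rm -rf "$HADOOP_DATA_DIR"   # delete the stale datanode block storage
run hadoop namenode -format     # reformat the namenode (wipes all metadata!)
run start-dfs.sh                # bring HDFS back up
```

Reformatting the namenode throws away the entire filesystem namespace, so this is only reasonable when, as in this post, the data is expendable or re-loadable.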