Mac Docker pull Error: Error response from daemon: Get https://xx.xx.xx.xx/v2/: Service Unavailable

Running docker pull xx.xx.xx.xx/xx/xx to pull an image from a private registry fails with the following error:

Error response from daemon: Get https://xx.xx.xx.xx/v2/: Service Unavailable

The reason is that Docker uses HTTPS by default, while the private registry only serves HTTP.

On a Mac, open Docker Desktop Preferences -> Docker Engine, add the following configuration (xx.xx.xx.xx is the address of your private registry), and click Apply & Restart.

{
    "insecure-registries":[
        "xx.xx.xx.xx"
    ]
}

On CentOS, edit /etc/docker/daemon.json and add the same configuration:

{
    "insecure-registries":[
        "xx.xx.xx.xx"
    ]
}
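After saving the file, restart the Docker daemon so the configuration takes effect (assuming Docker is managed by systemd); docker info should then list the registry under Insecure Registries.

sudo systemctl restart docker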


Openstack virtual machine disk IO error [How to Solve]

1: The customer's virtual machine hit a read-only filesystem problem. The first attempt was to remount the root filesystem read-write, which had no effect:

mount -o remount,rw /

2: The customer rebooted the virtual machine and it dropped into single-user mode with a prompt pointing at fstab. After commenting out the vdb entry in /etc/fstab (see the example below), the VM booted normally.
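For reference, a commented-out vdb entry in /etc/fstab might look like this (the mount point and options here are placeholders):

# /dev/vdb    /data    xfs    defaults    0 0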
3: Mounting vdb from inside the virtual machine then reported an I/O error. A problem with the Ceph cluster was suspected, but the cluster status and the physical disks all checked out normal, so the following command was used to repair the XFS filesystem:

xfs_repair -v -L /dev/vdb
After that, the kernel still reported disk I/O errors:
[  164.966649] blk_update_request: I/O error, dev vdb, sector 0
[  164.967942] Buffer I/O error on dev vdb, logical block 0, lost async page write
[  164.968885] blk_update_request: I/O error, dev vdb, sector 16
[  164.969652] Buffer I/O error on dev vdb, logical block 2, lost async page write
[  164.970633] Buffer I/O error on dev vdb, logical block 3, lost async page write
[  164.971711] Buffer I/O error on dev vdb, logical block 4, lost async page write
[  164.972838] Buffer I/O error on dev vdb, logical block 5, lost async page write
[  164.974001] Buffer I/O error on dev vdb, logical block 6, lost async page write
[  164.975050] Buffer I/O error on dev vdb, logical block 7, lost async page write
[  164.976249] Buffer I/O error on dev vdb, logical block 8, lost async page write
[  164.977188] Buffer I/O error on dev vdb, logical block 9, lost async page write
[  164.978043] Buffer I/O error on dev vdb, logical block 10, lost async page write
[  164.978930] blk_update_request: I/O error, dev vdb, sector 29360128
[  164.983874] blk_update_request: I/O error, dev vdb, sector 29360208
[  164.992586] blk_update_request: I/O error, dev vdb, sector 29360264
[  165.000701] blk_update_request: I/O error, dev vdb, sector 29366384
[  165.010514] blk_update_request: I/O error, dev vdb, sector 29368784
[  165.018809] blk_update_request: I/O error, dev vdb, sector 29435544
[  165.026458] blk_update_request: I/O error, dev vdb, sector 29452288
[  165.034674] blk_update_request: I/O error, dev vdb, sector 33554432

4: The disk was detached, and the next attempt was to take a snapshot of the volume and create a new disk from the snapshot. Snapshot creation failed, reporting that the volume is read-only:

2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server   File "/var/lib/kolla/venv/lib/python2.7/site-packages/eventlet/tpool.py", line 148, in proxy_call
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server     rv = execute(f, *args, **kwargs)
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server   File "/var/lib/kolla/venv/lib/python2.7/site-packages/eventlet/tpool.py", line 129, in execute
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server     six.reraise(c, e, tb)
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server   File "/var/lib/kolla/venv/lib/python2.7/site-packages/eventlet/tpool.py", line 83, in tworker
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server     rv = meth(*args, **kwargs)
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server   File "rbd.pyx", line 3484, in rbd.Image.create_snap
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server ReadOnlyImage: [errno 30] error creating snapshot snapshot-0aa66c79-6c0f-4735-9f77-da9ade0e11fa from volume-6b3c1c7d-4337-4301-b224-3746686dec05
2021-09-17 18:23:14.541 58 ERROR oslo_messaging.rpc.server

5: Check the volume status in Cinder:

[[email protected] ~]# cinder show 6b3c1c7d-4337-4301-b224-3746686dec05
+--------------------------------+--------------------------------------+
| Property                       | Value                                |
+--------------------------------+--------------------------------------+
| attached_servers               | []                                   |
| attachment_ids                 | []                                   |
| availability_zone              | nova                                 |
| bootable                       | false                                |
| consistencygroup_id            | None                                 |
| created_at                     | 2021-02-04T03:10:53.000000           |
| description                    |                                      |
| encrypted                      | False                                |
| group_id                       | None                                 |
| id                             | 6b3c1c7d-4337-4301-b224-3746686dec05 |
| metadata                       |                                      |
| migration_status               | None                                 |
| multiattach                    | False                                |
| name                           | Public Services-2                           |
| os-vol-host-attr:host          | [email protected]#tencent              |
| os-vol-mig-status-attr:migstat | None                                 |
| os-vol-mig-status-attr:name_id | None                                 |
| os-vol-tenant-attr:tenant_id   | fe2aeba987624a66864f5aa6992c64a7     |
| provider_id                    | None                                 |
| replication_status             | None                                 |
| service_uuid                   | 43314489-49f4-43e3-8d5e-6512a7ab3953 |
| shared_targets                 | False                                |
| size                           | 1024                                 |
| snapshot_id                    | None                                 |
| source_volid                   | None                                 |
| status                         | available                            |
| updated_at                     | 2021-09-17T10:19:15.000000           |
| user_id                        | 6fbcd30ba47b4480ba57f56fd5fa45e0     |
| volume_type                    | tencent                              |
+--------------------------------+--------------------------------------+
Check the volume status via ceph, as follows.
(ceph-mon)[[email protected] /]# rbd info tencent/volume-6b3c1c7d-4337-4301-b224-3746686dec05
rbd image 'volume-6b3c1c7d-4337-4301-b224-3746686dec05':
	size 1 TiB in 262144 objects
	order 22 (4 MiB objects)
	snapshot_count: 1
	id: d301e3babf00f2
	block_name_prefix: rbd_data.d301e3babf00f2
	format: 2
	features: layering, exclusive-lock, object-map, fast-diff, deep-flatten
	op_features: 
	flags: 
	create_timestamp: Thu Feb  4 11:10:53 2021
	access_timestamp: Fri Sep 17 18:58:02 2021
	modify_timestamp: Fri Sep 17 15:33:18 2021
 ###########################################
 (ceph-mon)[[email protected] /]# rbd lock ls tencent/volume-6b3c1c7d-4337-4301-b224-3746686dec05
There is 1 exclusive lock on this image.
Locker          ID                  Address                   
client.98051222 auto 94571522251008 192.168.2.31:0/1375643388  
(ceph-mon)[[email protected] /]# 
SSH to host 192.168.2.31 shows no such client process, so this stale exclusive lock needs to be removed:
(ceph-mon)[[email protected] /]# rbd lock rm tencent/volume-6b3c1c7d-4337-4301-b224-3746686dec05 "auto 94571522251008" client.98051222
(ceph-mon)[[email protected] /]# rbd lock ls tencent/volume-6b3c1c7d-4337-4301-b224-3746686dec05

6: After removing the lock, the volume mounts normally again.

[Solved] ERROR PythonRunner: Python worker exited unexpectedly (crashed)

Some time ago a reader messaged me about an error when running a PySpark job in PyCharm: ERROR PythonRunner: Python worker exited unexpectedly (crashed).

In testing, print(input_rdd.first()) prints fine, but print(input_rdd.count()), which triggers a full job, reports the error:

print(input_rdd.count())
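For reference, a minimal sketch of the scenario (the master setting, app name and input path below are placeholders for illustration):

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local[2]").setAppName("web_analysis")
sc = SparkContext(conf=conf)

input_rdd = sc.textFile("data/access.log")  # hypothetical input file
print(input_rdd.first())  # reads a single record and works
print(input_rdd.count())  # runs a full job over the data and triggers the crash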

ERROR PythonRunner: Python worker exited unexpectedly (crashed) means the Python worker process backing the task died unexpectedly. The full error log:

21/10/24 10:24:48 ERROR PythonRunner: Python worker exited unexpectedly (crashed)
java.net.SocketException: Connection reset by peer: socket write error
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:95)
	at java.io.DataOutputStream.writeInt(DataOutputStream.java:199)
	at org.apache.spark.api.python.PythonRDD$.writeUTF(PythonRDD.scala:476)
	at org.apache.spark.api.python.PythonRDD$.write$1(PythonRDD.scala:297)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1$adapted(PythonRDD.scala:307)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:621)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:397)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:232)
21/10/24 10:24:48 ERROR PythonRunner: This may have been caused by a prior exception:
java.net.SocketException: Connection reset by peer: socket write error
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:95)
	at java.io.DataOutputStream.writeInt(DataOutputStream.java:199)
	at org.apache.spark.api.python.PythonRDD$.writeUTF(PythonRDD.scala:476)
	at org.apache.spark.api.python.PythonRDD$.write$1(PythonRDD.scala:297)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1$adapted(PythonRDD.scala:307)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:621)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:397)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:232)
21/10/24 10:24:48 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.net.SocketException: Connection reset by peer: socket write error
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:95)
	at java.io.DataOutputStream.writeInt(DataOutputStream.java:199)
	at org.apache.spark.api.python.PythonRDD$.writeUTF(PythonRDD.scala:476)
	at org.apache.spark.api.python.PythonRDD$.write$1(PythonRDD.scala:297)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1$adapted(PythonRDD.scala:307)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:621)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:397)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:232)
21/10/24 10:24:48 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) (LAPTOP-RK2V2UMB executor driver): java.net.SocketException: Connection reset by peer: socket write error
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:95)
	at java.io.DataOutputStream.writeInt(DataOutputStream.java:199)
	at org.apache.spark.api.python.PythonRDD$.writeUTF(PythonRDD.scala:476)
	at org.apache.spark.api.python.PythonRDD$.write$1(PythonRDD.scala:297)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1$adapted(PythonRDD.scala:307)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:621)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:397)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:232)

21/10/24 10:24:48 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
Traceback (most recent call last):
  File "D:/Code/pycode/exercise/pyspark-study/pyspark-learning/pyspark-day04/main/01_web_analysis.py", line 28, in <module>
    print(input_rdd.first())
  File "D:\opt\Anaconda3-2020.11\lib\site-packages\pyspark\rdd.py", line 1586, in first
    rs = self.take(1)
  File "D:\opt\Anaconda3-2020.11\lib\site-packages\pyspark\rdd.py", line 1566, in take
    res = self.context.runJob(self, takeUpToNumLeft, p)
  File "D:\opt\Anaconda3-2020.11\lib\site-packages\pyspark\context.py", line 1233, in runJob
    sock_info = self._jvm.PythonRDD.runJob(self._jsc.sc(), mappedRDD._jrdd, partitions)
  File "D:\opt\Anaconda3-2020.11\lib\site-packages\py4j\java_gateway.py", line 1304, in __call__
    return_value = get_return_value(
  File "D:\opt\Anaconda3-2020.11\lib\site-packages\py4j\protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (LAPTOP-RK2V2UMB executor driver): java.net.SocketException: Connection reset by peer: socket write error
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:95)
	at java.io.DataOutputStream.writeInt(DataOutputStream.java:199)
	at org.apache.spark.api.python.PythonRDD$.writeUTF(PythonRDD.scala:476)
	at org.apache.spark.api.python.PythonRDD$.write$1(PythonRDD.scala:297)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1$adapted(PythonRDD.scala:307)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:621)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:397)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:232)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2258)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2207)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2206)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2206)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1079)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1079)
	at scala.Option.foreach(Option.scala:407)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1079)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2445)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2387)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2376)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:868)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2196)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2217)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2236)
	at org.apache.spark.api.python.PythonRDD$.runJob(PythonRDD.scala:166)
	at org.apache.spark.api.python.PythonRDD.runJob(PythonRDD.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketException: Connection reset by peer: socket write error
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:95)
	at java.io.DataOutputStream.writeInt(DataOutputStream.java:199)
	at org.apache.spark.api.python.PythonRDD$.writeUTF(PythonRDD.scala:476)
	at org.apache.spark.api.python.PythonRDD$.write$1(PythonRDD.scala:297)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRDD$.$anonfun$writeIteratorToStream$1$adapted(PythonRDD.scala:307)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307)
	at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:621)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:397)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
	at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:232)


Process finished with exit code 1

Searching online shows this error can have many causes. In the case I helped with, Spark was running locally on a Windows machine and the problem was on the tooling side: with a somewhat large amount of data, jobs launched from PyCharm can fail this way.

The fix that worked here is simple: close PyCharm, open it again, and re-run the job. If the error persists, shut it down and try once more.

[Solved] Tomcat Error: org.apache.tomcat.util.digester.Digester.fatalError Parse fatal error at line [40] column [36]

Error Messages:

org.apache.tomcat.util.digester.Digester.fatalError Parse fatal error at line [40] column [36]
org.xml.sax.SAXParseException; lineNumber: 40; columnNumber: 36; The value of attribute "password" associated with an element type "user" must not contain the '<' character.

Solution:
Edit apache-tomcat-10.0.12\conf\tomcat-users.xml:

  <user username="admin" password="admin" roles="manager-gui"/>
  <user username="robot" password="<must-be-changed>" roles="manager-script"/>
  <user username="tomcat" password="admin" roles="tomcat"/>
  <user username="both" password="<must-be-changed>" roles="tomcat,role1"/>
  <user username="role1" password="<must-be-changed>" roles="role1"/>

Change the password so it no longer contains the '<' character, e.g.:
<user username="robot" password="admin" roles="manager-script"/>
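After editing tomcat-users.xml, restart Tomcat so the change is picked up; on Windows, for example:

apache-tomcat-10.0.12\bin\shutdown.bat
apache-tomcat-10.0.12\bin\startup.bat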

[Solved] ERROR Error loading vue.config.js: ERROR Error: Command failed: git describe

Recently I cloned a front-end/back-end separated project with git clone and ran into a problem starting the front end with Node.js: both npm run serve and yarn run serve failed with this error. Re-downloading the already-installed node_modules folder and even re-cloning the whole front-end project did not help, and searching for the error Command failed: git describe --always turned up no matching articles.

It eventually turned out that the error was caused by Git's environment variable (PATH) configuration.

With the wrong Git PATH configuration in place, starting the project fails with Command failed: git describe --always.

To configure the environment variable correctly, append Git's cmd directory to the existing Path entry, as in the example below.
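For example, a typical value to append to the Path variable is Git's cmd directory (the install location below is only an example; use wherever Git is actually installed):

C:\Program Files\Git\cmd

After updating Path, you can verify in a new cmd window that Git is reachable by running git describe --always inside the cloned project.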

Then open a new cmd window and run yarn run serve; the project starts normally. That is how this particular problem was solved.

How to Solve Common JVM Errors: OutOfMemoryError & StackOverflowError

OutOfMemoryError

Cause: java.lang.OutOfMemoryError: Java heap space means the heap memory has overflowed.
Solution: adjust the heap size with the -Xms/-Xmx options.
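For example, a process can be started with larger heap limits like this (the class name is a placeholder):

java -Xms256m -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError Main

The snippet below instead uses a deliberately tiny heap (-Xms1m -Xmx10m) to reproduce the error: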

// -Xms1m -Xmx10m -XX:+PrintGCDetails
		List<Object> listObject = new ArrayList<>();
		for (int i = 0; i < 10; i++) {
			System.out.println("i:" + i);
			Byte[] bytes = new Byte[1 * 1024 * 1024];
			listObject.add(bytes);
		}
		System.out.println("Added successfully...");

StackOverflowError

Cause: java.lang.StackOverflowError indicates a stack overflow, which usually happens with deep or unbounded recursive calls.
Solution: increase the thread stack size with -Xss (the default is platform dependent, typically around 1 MB), which raises the maximum call depth.

//-Xss5m sets the thread stack size to 5 MB, which raises the maximum call depth
public class StackTest {
	private static int count;
	public static void count(){
		try {
			count++;
			count(); 
		} catch (Throwable e) {
			System.out.println("the maximum depth:"+count);
			e.printStackTrace();
		}
	}
	public static void main(String[] args) {
			 count();
	}
}

[Solved] OpenCV Compile Error: (CMake Error: The following variables are used in this project, but they are set to NOTFOUND)

Error message

On a Jetson NX, compiling OpenCV 4.1.1 fails with the following CMake error:

CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
CUDA_cublas_LIBRARY (ADVANCED)
    linked by target "opencv_cudev" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudev
    linked by target "opencv_test_cudev" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudev/test
    linked by target "opencv_test_core" in directory /opt/opencv4/opencv-4.1.1/modules/core
    linked by target "opencv_perf_core" in directory /opt/opencv4/opencv-4.1.1/modules/core
    linked by target "opencv_core" in directory /opt/opencv4/opencv-4.1.1/modules/core
    linked by target "opencv_test_cudaarithm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaarithm
    linked by target "opencv_cudaarithm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaarithm
    linked by target "opencv_cudaarithm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaarithm
    linked by target "opencv_perf_cudaarithm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaarithm
    linked by target "opencv_flann" in directory /opt/opencv4/opencv-4.1.1/modules/flann
    linked by target "opencv_test_flann" in directory /opt/opencv4/opencv-4.1.1/modules/flann
    linked by target "opencv_perf_imgproc" in directory /opt/opencv4/opencv-4.1.1/modules/imgproc
    linked by target "opencv_test_imgproc" in directory /opt/opencv4/opencv-4.1.1/modules/imgproc
    linked by target "opencv_imgproc" in directory /opt/opencv4/opencv-4.1.1/modules/imgproc
    linked by target "opencv_test_ml" in directory /opt/opencv4/opencv-4.1.1/modules/ml
    linked by target "opencv_ml" in directory /opt/opencv4/opencv-4.1.1/modules/ml
    linked by target "opencv_test_phase_unwrapping" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/phase_unwrapping
    linked by target "opencv_phase_unwrapping" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/phase_unwrapping
    linked by target "opencv_plot" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/plot
    linked by target "opencv_test_quality" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/quality
    linked by target "opencv_quality" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/quality
    linked by target "opencv_test_reg" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/reg
    linked by target "opencv_reg" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/reg
    linked by target "opencv_perf_reg" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/reg
    linked by target "opencv_surface_matching" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/surface_matching
    linked by target "opencv_test_cudafilters" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudafilters
    linked by target "opencv_perf_cudafilters" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudafilters
    linked by target "opencv_cudafilters" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudafilters
    linked by target "opencv_test_cudaimgproc" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaimgproc
    linked by target "opencv_cudaimgproc" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaimgproc
    linked by target "opencv_perf_cudaimgproc" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaimgproc
    linked by target "opencv_test_cudawarping" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudawarping
    linked by target "opencv_cudawarping" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudawarping
    linked by target "opencv_perf_cudawarping" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudawarping
    linked by target "opencv_dnn" in directory /opt/opencv4/opencv-4.1.1/modules/dnn
    linked by target "opencv_perf_dnn" in directory /opt/opencv4/opencv-4.1.1/modules/dnn
    linked by target "opencv_test_dnn" in directory /opt/opencv4/opencv-4.1.1/modules/dnn
    linked by target "opencv_features2d" in directory /opt/opencv4/opencv-4.1.1/modules/features2d
    linked by target "opencv_perf_features2d" in directory /opt/opencv4/opencv-4.1.1/modules/features2d
    linked by target "opencv_test_features2d" in directory /opt/opencv4/opencv-4.1.1/modules/features2d
    linked by target "opencv_test_fuzzy" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/fuzzy
    linked by target "opencv_fuzzy" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/fuzzy
    linked by target "opencv_hfs" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/hfs
    linked by target "opencv_test_img_hash" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/img_hash
    linked by target "opencv_img_hash" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/img_hash
    linked by target "opencv_imgcodecs" in directory /opt/opencv4/opencv-4.1.1/modules/imgcodecs
    linked by target "opencv_perf_imgcodecs" in directory /opt/opencv4/opencv-4.1.1/modules/imgcodecs
    linked by target "opencv_test_imgcodecs" in directory /opt/opencv4/opencv-4.1.1/modules/imgcodecs
    linked by target "opencv_test_line_descriptor" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/line_descriptor
    linked by target "opencv_perf_line_descriptor" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/line_descriptor
    linked by target "opencv_line_descriptor" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/line_descriptor
    linked by target "opencv_test_photo" in directory /opt/opencv4/opencv-4.1.1/modules/photo
    linked by target "opencv_photo" in directory /opt/opencv4/opencv-4.1.1/modules/photo
    linked by target "opencv_perf_photo" in directory /opt/opencv4/opencv-4.1.1/modules/photo
    linked by target "opencv_test_saliency" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/saliency
    linked by target "opencv_saliency" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/saliency
    linked by target "opencv_test_videoio" in directory /opt/opencv4/opencv-4.1.1/modules/videoio
    linked by target "opencv_videoio" in directory /opt/opencv4/opencv-4.1.1/modules/videoio
    linked by target "opencv_perf_videoio" in directory /opt/opencv4/opencv-4.1.1/modules/videoio
    linked by target "opencv_test_xphoto" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xphoto
    linked by target "opencv_perf_xphoto" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xphoto
    linked by target "opencv_xphoto" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xphoto
    linked by target "opencv_calib3d" in directory /opt/opencv4/opencv-4.1.1/modules/calib3d
    linked by target "opencv_perf_calib3d" in directory /opt/opencv4/opencv-4.1.1/modules/calib3d
    linked by target "opencv_test_calib3d" in directory /opt/opencv4/opencv-4.1.1/modules/calib3d
    linked by target "opencv_test_cudacodec" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudacodec
    linked by target "opencv_cudacodec" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudacodec
    linked by target "opencv_perf_cudacodec" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudacodec
    linked by target "opencv_test_cudafeatures2d" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudafeatures2d
    linked by target "opencv_cudafeatures2d" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudafeatures2d
    linked by target "opencv_perf_cudafeatures2d" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudafeatures2d
    linked by target "opencv_test_cudastereo" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudastereo
    linked by target "opencv_cudastereo" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudastereo
    linked by target "opencv_perf_cudastereo" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudastereo
    linked by target "opencv_test_highgui" in directory /opt/opencv4/opencv-4.1.1/modules/highgui
    linked by target "opencv_highgui" in directory /opt/opencv4/opencv-4.1.1/modules/highgui
    linked by target "opencv_test_objdetect" in directory /opt/opencv4/opencv-4.1.1/modules/objdetect
    linked by target "opencv_objdetect" in directory /opt/opencv4/opencv-4.1.1/modules/objdetect
    linked by target "opencv_perf_objdetect" in directory /opt/opencv4/opencv-4.1.1/modules/objdetect
    linked by target "opencv_rgbd" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/rgbd
    linked by target "opencv_test_rgbd" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/rgbd
    linked by target "opencv_test_shape" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/shape
    linked by target "opencv_shape" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/shape
    linked by target "opencv_test_structured_light" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/structured_light
    linked by target "opencv_structured_light" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/structured_light
    linked by target "opencv_text" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/text
    linked by target "opencv_test_text" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/text
    linked by target "opencv_ts" in directory /opt/opencv4/opencv-4.1.1/modules/ts
    linked by target "opencv_video" in directory /opt/opencv4/opencv-4.1.1/modules/video
    linked by target "opencv_perf_video" in directory /opt/opencv4/opencv-4.1.1/modules/video
    linked by target "opencv_test_video" in directory /opt/opencv4/opencv-4.1.1/modules/video
    linked by target "opencv_xfeatures2d" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xfeatures2d
    linked by target "opencv_perf_xfeatures2d" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xfeatures2d
    linked by target "opencv_test_xfeatures2d" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xfeatures2d
    linked by target "opencv_perf_ximgproc" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/ximgproc
    linked by target "opencv_ximgproc" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/ximgproc
    linked by target "opencv_test_ximgproc" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/ximgproc
    linked by target "opencv_xobjdetect" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/xobjdetect
    linked by target "opencv_test_aruco" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/aruco
    linked by target "opencv_aruco" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/aruco
    linked by target "opencv_bgsegm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/bgsegm
    linked by target "opencv_test_bgsegm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/bgsegm
    linked by target "opencv_test_bioinspired" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/bioinspired
    linked by target "opencv_perf_bioinspired" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/bioinspired
    linked by target "opencv_bioinspired" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/bioinspired
    linked by target "opencv_ccalib" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/ccalib
    linked by target "opencv_test_cudabgsegm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudabgsegm
    linked by target "opencv_cudabgsegm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudabgsegm
    linked by target "opencv_perf_cudabgsegm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudabgsegm
    linked by target "opencv_test_cudalegacy" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudalegacy
    linked by target "opencv_cudalegacy" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudalegacy
    linked by target "opencv_perf_cudalegacy" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudalegacy
    linked by target "opencv_test_cudaobjdetect" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaobjdetect
    linked by target "opencv_perf_cudaobjdetect" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaobjdetect
    linked by target "opencv_cudaobjdetect" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaobjdetect
    linked by target "opencv_datasets" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/datasets
    linked by target "opencv_dnn_objdetect" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/dnn_objdetect
    linked by target "opencv_dpm" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/dpm
    linked by target "opencv_test_face" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/face
    linked by target "opencv_face" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/face
    linked by target "opencv_test_optflow" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/optflow
    linked by target "opencv_optflow" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/optflow
    linked by target "opencv_perf_optflow" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/optflow
    linked by target "opencv_stitching" in directory /opt/opencv4/opencv-4.1.1/modules/stitching
    linked by target "opencv_test_tracking" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/tracking
    linked by target "opencv_tracking" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/tracking
    linked by target "opencv_perf_tracking" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/tracking
    linked by target "opencv_cudaoptflow" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaoptflow
    linked by target "opencv_perf_cudaoptflow" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaoptflow
    linked by target "opencv_test_cudaoptflow" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/cudaoptflow
    linked by target "opencv_test_stereo" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/stereo
    linked by target "opencv_stereo" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/stereo
    linked by target "opencv_perf_stereo" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/stereo
    linked by target "opencv_test_superres" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/superres
    linked by target "opencv_superres" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/superres
    linked by target "opencv_perf_superres" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/superres
    linked by target "opencv_test_videostab" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/videostab
    linked by target "opencv_videostab" in directory /opt/opencv4/opencv_contrib-4.1.1/modules/videostab
    linked by target "opencv_annotation" in directory /opt/opencv4/opencv-4.1.1/apps/annotation
    linked by target "opencv_visualisation" in directory /opt/opencv4/opencv-4.1.1/apps/visualisation
    linked by target "opencv_interactive-calibration" in directory /opt/opencv4/opencv-4.1.1/apps/interactive-calibration
    linked by target "opencv_version" in directory /opt/opencv4/opencv-4.1.1/apps/version

-- Configuring incomplete, errors occurred!
See also "/opt/opencv4/opencv-4.1.1/build/CMakeFiles/CMakeOutput.log".
See also "/opt/opencv4/opencv-4.1.1/build/CMakeFiles/CMakeError.log".

Solution

This is caused by missing CUDA library files (here, the cuBLAS library that CUDA_cublas_LIBRARY points to).

Since my board runs CUDA 10.2, I install cuda-toolkit-10-2; pick the cuda-toolkit-x-x package that matches your CUDA version. Installing it pulls in the missing files automatically.
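If you are not sure which CUDA version is present, one way to check (assuming nvcc is on the PATH) is:

nvcc --version

Then install the toolkit package that matches the reported version: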

sudo apt-get update
sudo apt-get install cuda-toolkit-10-2

After installation, re-run CMake and recompile OpenCV.
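For example, reconfiguring in the existing build directory (CMake re-uses the options cached in CMakeCache.txt from the previous run) and rebuilding:

cd /opt/opencv4/opencv-4.1.1/build
cmake ..
make -j$(nproc)

If CMake still reports the variable as NOTFOUND, remove CMakeCache.txt and configure again with your original options.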

Error: Please renew the default configurations. [How to Solve]

Problem

The current port configuration is as follows:

[SW1]int g0/0/1
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]dis this
#
interface GigabitEthernet0/0/1
 port link-type trunk
 undo port trunk allow-pass vlan 1
 port trunk allow-pass vlan 2 to 4094
#
return

When the port link type needs to be modified or removed, the switch reports the error: Error: Please renew the default configurations.

[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]port link-type access 
Error: Please renew the default configurations.
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]undo port link-type 
Error: Please renew the default configurations.

Cause

The error occurs because VLAN configuration (here, the trunk allow-pass commands) has already been applied on top of the port's link type, so the link type cannot be changed or removed until that configuration is undone.

For example, to delete a parent folder you first have to delete its subfolders, removing things layer by layer. Huawei configuration commands work the same way: they must be undone level by level.

|--Parent Folder
|---- subfolder

Solution

Undo every command under the interface except port link-type trunk to restore the default configuration; after that, the link type can be changed or the command removed.

[SW1-GigabitEthernet0/0/1]dis this
#
interface GigabitEthernet0/0/1
 port link-type trunk
 undo port trunk allow-pass vlan 1
#
return
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]undo port trunk allow-pass vlan all
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]port trunk allow-pass vlan 1
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]dis this
#
interface GigabitEthernet0/0/1
 port link-type trunk
#
return
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]undo port link-type 
[SW1-GigabitEthernet0/0/1]
[SW1-GigabitEthernet0/0/1]dis this
#
interface GigabitEthernet0/0/1
#
return

[Solved] ERROR org.apache.struts2.dispatcher.Dispatcher – Dispatcher initialization failed

Problem description

A newly created Eclipse project cannot run its JSP pages.

The console reports the following error:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
09:04:55.281 [localhost-startStop-1] ERROR org.apache.struts2.dispatcher.Dispatcher - Dispatcher initialization failed
com.opensymphony.xwork2.config.ConfigurationException: Unable to load configuration.
	at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:70) ~[xwork-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.dispatcher.Dispatcher.getContainer(Dispatcher.java:978) ~[struts2-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:446) ~[struts2-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:490) [struts2-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.dispatcher.ng.InitOperations.initDispatcher(InitOperations.java:74) [struts2-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.dispatcher.ng.filter.StrutsPrepareAndExecuteFilter.init(StrutsPrepareAndExecuteFilter.java:57) [struts2-core-2.3.37.jar:2.3.37]
	at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:281) [catalina.jar:7.0.107]
	at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:262) [catalina.jar:7.0.107]
	at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:106) [catalina.jar:7.0.107]
	at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4973) [catalina.jar:7.0.107]
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5672) [catalina.jar:7.0.107]
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183) [catalina.jar:7.0.107]
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1689) [catalina.jar:7.0.107]
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1679) [catalina.jar:7.0.107]
	at java.util.concurrent.FutureTask.run(Unknown Source) [?:1.8.0_51]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_51]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_51]
	at java.lang.Thread.run(Unknown Source) [?:1.8.0_51]
Caused by: com.opensymphony.xwork2.config.ConfigurationException: Action class [Action.complexAction] not found
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.verifyAction(XmlConfigurationProvider.java:486) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addAction(XmlConfigurationProvider.java:429) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:556) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:295) ~[xwork-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:112) ~[struts2-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:264) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:67) ~[xwork-core-2.3.37.jar:2.3.37]
	... 17 more
Oct 22, 2021 9:04:55 AM org.apache.catalina.core.StandardContext filterStart
Severe: Start filter exception
Unable to load configuration. - action - file:/F:/JavaEE/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/Week7/WEB-INF/classes/struts.xml:14:64
	at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:504)
	at org.apache.struts2.dispatcher.ng.InitOperations.initDispatcher(InitOperations.java:74)
	at org.apache.struts2.dispatcher.ng.filter.StrutsPrepareAndExecuteFilter.init(StrutsPrepareAndExecuteFilter.java:57)
	at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:281)
	at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:262)
	at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:106)
	at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4973)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5672)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1689)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1679)
	at java.util.concurrent.FutureTask.run(Unknown Source)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.lang.Thread.run(Unknown Source)
Caused by: Unable to load configuration. - action - file:/F:/JavaEE/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/Week7/WEB-INF/classes/struts.xml:14:64
	at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:70)
	at org.apache.struts2.dispatcher.Dispatcher.getContainer(Dispatcher.java:978)
	at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:446)
	at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:490)
	... 14 more
Caused by: Action class [Action.complexAction] not found - action - file:/F:/JavaEE/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/Week7/WEB-INF/classes/struts.xml:14:64
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.verifyAction(XmlConfigurationProvider.java:486)
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addAction(XmlConfigurationProvider.java:429)
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:556)
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:295)
	at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:112)
	at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:264)
	at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:67)
	... 17 more

An online search turns up many suggested fixes, such as resolving jar conflicts in Tomcat or reinstalling Tomcat, but other projects still ran fine, so there was no point in blindly trying them. It pays to look at the root cause, and the stack trace already points to where the problem is:

Caused by: com.opensymphony.xwork2.config.ConfigurationException: Action class [Action.complexAction] not found
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.verifyAction(XmlConfigurationProvider.java:486) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addAction(XmlConfigurationProvider.java:429) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:556) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:295) ~[xwork-core-2.3.37.jar:2.3.37]
	at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:112) ~[struts2-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:264) ~[xwork-core-2.3.37.jar:2.3.37]
	at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:67) ~[xwork-core-2.3.37.jar:2.3.37]

The action class Action.complexAction could not be found, which means there is a problem with the action configuration in the struts.xml file. If you also hit this error, check that configuration file first; a sketch of what a valid mapping can look like is shown below.
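For reference, a minimal sketch of a valid action mapping in struts.xml (the package, namespace, class and result values are placeholders; the class attribute must be the fully qualified name of an action class that actually exists on the classpath):

<package name="default" namespace="/" extends="struts-default">
    <action name="complexAction" class="com.example.action.ComplexAction">
        <result name="success">/complex.jsp</result>
    </action>
</package>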

Xilinx Vitis Error Launching Program: Memory write error MMU section translation fault

Run As -> Launch Hardware (Single Application Debug (GDB))  Error:
Error while launching program:
Memory write error at 0x100000. MMU section translation fault

The cause is that the JP4 boot-mode jumper was mistakenly set to SD; after switching it to JTAG, the connection works and Program Device completes successfully.

 

NPM install Error: gyp ERR! stack Error: Could not find any Python installation to use

When installing a Vue project's dependencies with npm install, the following error is reported:

gyp verb command rebuild []
gyp verb command clean []
gyp verb clean removing "build" directory
gyp verb command configure []
gyp verb find Python Python is not set from command line or npm configuration
gyp verb find Python Python is not set from environment variable PYTHON
gyp verb find Python checking if "python3" can be used
gyp verb find Python - executing "python3" to get executable path
gyp verb find Python - "python3" is not in PATH or produced an error
gyp verb find Python checking if "python" can be used
gyp verb find Python - executing "python" to get executable path
gyp verb find Python - "python" is not in PATH or produced an error
gyp verb find Python checking if "python2" can be used
gyp verb find Python - executing "python2" to get executable path
gyp verb find Python - "python2" is not in PATH or produced an error
gyp verb find Python checking if Python is C:\Python37\python.exe
gyp verb find Python - executing "C:\Python37\python.exe" to get version
gyp verb find Python - "C:\Python37\python.exe" could not be run
gyp verb find Python checking if Python is C:\Python27\python.exe
gyp verb find Python - executing "C:\Python27\python.exe" to get version
gyp verb find Python - "C:\Python27\python.exe" could not be run
gyp verb find Python checking if the py launcher can be used to find Python
gyp verb find Python - executing "py.exe" to get Python executable path
gyp verb find Python - "py.exe" is not in PATH or produced an error
gyp ERR! find Python
gyp ERR! find Python Python is not set from command line or npm configuration
gyp ERR! find Python Python is not set from environment variable PYTHON
gyp ERR! find Python checking if "python3" can be used
gyp ERR! find Python - "python3" is not in PATH or produced an error
gyp ERR! find Python checking if "python" can be used
gyp ERR! find Python - "python" is not in PATH or produced an error
gyp ERR! find Python checking if "python2" can be used
gyp ERR! find Python - "python2" is not in PATH or produced an error
gyp ERR! find Python checking if Python is C:\Python37\python.exe
gyp ERR! find Python - "C:\Python37\python.exe" could not be run
gyp ERR! find Python checking if Python is C:\Python27\python.exe
gyp ERR! find Python - "C:\Python27\python.exe" could not be run
gyp ERR! find Python checking if the py launcher can be used to find Python
gyp ERR! find Python - "py.exe" is not in PATH or produced an error
gyp ERR! find Python
gyp ERR! find Python **********************************************************
gyp ERR! find Python You need to install the latest version of Python.
gyp ERR! find Python Node-gyp should be able to find and use Python. If not,
gyp ERR! find Python you can try one of the following options:
gyp ERR! find Python - Use the switch --python="C:\Path\To\python.exe"
gyp ERR! find Python   (accepted by both node-gyp and npm)
gyp ERR! find Python - Set the environment variable PYTHON
gyp ERR! find Python - Set the npm configuration variable python:
gyp ERR! find Python   npm config set python "C:\Path\To\python.exe"
gyp ERR! find Python For more information consult the documentation at:
gyp ERR! find Python https://github.com/nodejs/node-gyp#installation
gyp ERR! find Python **********************************************************
gyp ERR! find Python
gyp ERR! configure error
gyp ERR! stack Error: Could not find any Python installation to use
gyp ERR! stack     at PythonFinder.fail (E:\project\DBApi-master\dbapi-ui\node_modules\node-gyp\lib\find-python.js:302:47)
gyp ERR! stack     at PythonFinder.runChecks (E:\project\DBApi-master\dbapi-ui\node_modules\node-gyp\lib\find-python.js:136:21)
gyp ERR! stack     at PythonFinder.<anonymous> (E:\project\DBApi-master\dbapi-ui\node_modules\node-gyp\lib\find-python.js:200:18)
gyp ERR! stack     at PythonFinder.execFileCallback (E:\project\DBApi-master\dbapi-ui\node_modules\node-gyp\lib\find-python.js:266:16)
gyp ERR! stack     at exithandler (child_process.js:390:5)
gyp ERR! stack     at ChildProcess.errorhandler (child_process.js:402:5)
gyp ERR! stack     at ChildProcess.emit (events.js:400:28)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:280:12)     
gyp ERR! stack     at onErrorNT (internal/child_process.js:469:16)
gyp ERR! stack     at processTicksAndRejections (internal/process/task_queues.js:82:21)
gyp ERR! System Windows_NT 10.0.19042
gyp ERR! command "D:\\nodejs\\node.exe" "E:\\project\\DBApi-master\\dbapi-ui\\node_modules\\node-gyp\\bin\\node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
gyp ERR! cwd E:\project\DBApi-master\dbapi-ui\node_modules\node-sass
gyp ERR! node -v v14.18.1
gyp ERR! node-gyp -v v7.1.2
gyp ERR! not ok
Build failed with error code: 1
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected]~2.3.2 (node_modules\chokidar\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected]^1.2.7 (node_modules\watchpack-chokidar2\node_modules\chokidar\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected]^1.2.7 (node_modules\webpack-dev-server\node_modules\chokidar\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] postinstall: `node scripts/build.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] postinstall script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     C:\Users\WRD\AppData\Roaming\npm-cache\_logs\2021-10-20T05_42_34_767Z-debug.log 

This is caused by a missing Python installation, which node-gyp requires. After downloading and installing Python from the official Python website, delete the node_modules directory and re-run npm install. Alternatively, point npm at an existing Python installation, as shown below.
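As the node-gyp output above itself suggests, npm can be told where Python lives (the path is a placeholder):

npm config set python "C:\Path\To\python.exe"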