Category Archives: How to Fix

How to Solve Unity Package Android Project Error

1. Background

A Unity project can be run in the following ways:

1. It can run directly on Android phones

2. It can be built into an APK file, which is then installed on the phone and run

3. It can be exported as an Android project and run from Android Studio

Methods 1 and 2 may work without any problem, but an error is reported when using the third method.

The first error:

unity UnityEngine.GUIUtility:ProcessEvent (int,intptr,bool&) (at /Users/bokken/buildslave/unity/build/Modules/IMGUI/GUIUtility.cs:189)

 

The second error:

UnityEditor.BuildPlayerWindow Error

 

2. Precautions

1. The Gradle environment is configured correctly, since the project compiles normally

2. The build directory path contains no Chinese characters

3. There is no problem with the packaging configuration

 

3. Solutions

When exporting the Android Studio project, the export directory name needs to match the Unity project's product name.

For example: make the Product Name under File > Build Settings > Player Settings > Player consistent with the directory name chosen under File > Build Settings > Build; once they match, the error is resolved.

Spark SQL Export Data to Kafka Error [How to Solve]

Failed to find data source: kafka. Please deploy the application as per the deployment section of "Structured Streaming + Kafka Integration Guide"

The reason for this error is the missing spark-sql-kafka-0-10_2.11-2.4.5.jar dependency.

Download the jar, upload it to the server, and add it to the submit command:

--jars spark-sql-kafka-0-10_2.11-2.4.5.jar

The error is still reported; this time it reads:

	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:69)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:87)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:177)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:173)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:201)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:198)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:173)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:93)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:91)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:727)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:727)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1$$anonfun$apply$1.apply(SQLExecution.scala:95)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:144)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:86)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:789)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:63)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:727)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:313)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:288)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:694)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
	... 45 more

Checking the Spark jars directory shows that there is no kafka-clients jar package.

Just add the kafka-clients dependency to the submit command as well:

spark-submit --master yarn --deploy-mode cluster --jars spark-sql-kafka-0-10_2.11-2.4.5.jar,kafka-clients-2.0.0.jar

Resubmit, and the problem is solved.
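
Alternatively, spark-submit can resolve the Kafka integration from Maven via --packages, which pulls in kafka-clients transitively. A sketch, assuming the cluster nodes have internet access; your-app.jar is a placeholder for the actual application jar:

spark-submit --master yarn --deploy-mode cluster \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.5 \
  your-app.jar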

Hive: How to Solve Derby Database Initialization Error

Error Messages:

Metastore connection URL:     jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver :     org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User:     APP
Starting metastore schema initialization to 3.1.0
Initialization script hive-schema-3.1.0.derby.sql
 
Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***

 

Solution:

Take guava-27.0-jre.jar from Hadoop's share/hadoop/common/lib directory
and use it to replace the older guava jar under Hive's lib directory.
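
A minimal sketch of the replacement, assuming HADOOP_HOME and HIVE_HOME point at the two installations (the exact version of the old guava jar shipped with Hive may differ on your install):

rm $HIVE_HOME/lib/guava-*.jar        # remove Hive's older guava
cp $HADOOP_HOME/share/hadoop/common/lib/guava-27.0-jre.jar $HIVE_HOME/lib/
schematool -dbType derby -initSchema # re-run the schema initialization

If the failed run left a half-initialized metastore_db directory behind, removing it before re-running schematool may also be needed.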

[Solved] Server Install MATLAB 2018a Error: license checkout failed -8


Solution:

  • Go to the licenses folder under the installation path
  • Open the license_standalone.lic file with Notepad
  • Replace “SIGN=” with “TS_OK SIGN=”
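
On a Linux server the same edit can be scripted. A sketch, assuming the current directory is the licenses folder; keep a backup, since the substitution is applied blindly:

cp license_standalone.lic license_standalone.lic.bak   # back up the original
sed -i 's/SIGN=/TS_OK SIGN=/g' license_standalone.lic  # prefix every SIGN= with TS_OK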

Refer to the official website solution

https://www.mathworks.com/matlabcentral/answers/91874-why-do-i-receive-license-manager-error-103

How to Solve Ubuntu18 Kalibr Compile Errors (Various Errors)

OpenCV 3.4.13 is too new for Kalibr, and several errors come up during compilation. They are summarized below:

Error 1: sudo pip install python-igraph --upgrade failed

Solution:

sudo apt-get install python-igraph

Error 2:

Could not find a package configuration file provided by "code_utils" with
any of the following names:
code_utilsConfig.cmake
code_utils-config.cmake

Solution:

  1. In the sumpixel_test.cpp file, change the line
    #include "backward.hpp"
    to
    #include "code_utils/backward.hpp"
  2. Put code_utils into the workspace first and build the workspace once, then add imu_utils and build again (see the sketch after this list)
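
A sketch of that build order, assuming a catkin workspace at ~/catkin_ws and the commonly used gaowenliang GitHub repositories (both URLs are assumptions; substitute your own sources if they differ):

cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/code_utils.git
cd ~/catkin_ws && catkin_make    # build code_utils first
cd ~/catkin_ws/src
git clone https://github.com/gaowenliang/imu_utils.git
cd ~/catkin_ws && catkin_make    # then build imu_utils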

 

Error 3: errors during compilation with catkin build -DCMAKE_BUILD_TYPE=Release -j4

3-1 error message:

 error: ‘CV_GRAY2RGB’ was not declared in this scope
     cv::cvtColor(imageCopy1, imageCopy1, CV_GRAY2RGB);
    
 error: ‘CV_TERMCRIT_ITER’ was not declared in this scope
         cv::TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1));

error: ‘CV_TERMCRIT_EPS’ was not declared in this scope
         cv::TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1));

3-1 solution: add a header file to the corresponding file:

#include <opencv2/imgproc/types_c.h>


3-2 cvStartWindowThread() error:

Change 3-2 to:

cv::startWindowThread()

3-3 CV_LOAD_IMAGE_UNCHANGED error:

3-3 changed to

cv::IMREAD_UNCHANGED

3-4 CV_LOAD_IMAGE_GRAYSCALE error:

3-4 changed to

cv::IMREAD_GRAYSCALE

3-5 CV_LOAD_IMAGE_COLOR error:

3-5 changed to

cv::IMREAD_COLOR

3-6 CV_LOAD_IMAGE_ANYDEPTH error:

3-6 changed to

cv::IMREAD_ANYDEPTH

3-7 CV_MINMAX error:

3-7 changed to

cv::NORM_MINMAX

3-8 CV_FONT_HERSHEY_SIMPLEX error:

3-8 changed to

cv::FONT_HERSHEY_SIMPLEX

3-9 CV_WINDOW_AUTOSIZE error:

3-9 changed to

cv::WINDOW_AUTOSIZE

3-10 error: aggregate 'std::ofstream out_t' has incomplete type and cannot be defined std::ofstream out_t;
3-10 solution: add the header file below:

#include <fstream>
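
Before building, a quick way to locate any remaining occurrences of these removed C-era identifiers is a recursive grep; a sketch, where the path to the Kalibr sources (~/kalibr_workspace/src) is an assumption:

grep -rn "CV_GRAY2RGB\|CV_TERMCRIT\|CV_LOAD_IMAGE\|CV_MINMAX\|CV_FONT\|CV_WINDOW" ~/kalibr_workspace/src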

PLSQL Startup Error: Initialization error [How to Solve]

Preface

After reinstalling VS2022 on the computer, PL/SQL reports an error on startup indicating that initialization failed.

1. Solution

I had installed Oracle Client 12.2, and according to the official website it requires the Microsoft Visual Studio 2013 Redistributable. The VS2022 upgrade removed this runtime.

Solution:
After downloading the Microsoft Visual Studio 2013 Redistributable, install it (you need to uninstall the higher version first).
Download address:
64-bit: vcredist_x64
32-bit: vcredist_x86

[Solved] vivado Install Error: Xilinx Design Tool Display in Red

Vivado installation error: Xilinx Design Tools already exists for 2019.2, specify a different program group entry

Reason for the error: Vivado has been installed before, so the Xilinx Design Tools folder already exists.

Solution: find the “Xilinx Design Tools” folder and delete it.
Xilinx Design Tools folder path: C:\Users\USERNAME\AppData\Roaming\Microsoft\Windows\Start Menu\Programs

 

[Solved] Linux C++ Compile Error: c++: internal compiler error: Killed (program cc1plus)

Compilation error:

/home/service/rpc/goya-rpc/src/rpc_server_impl.cc: In member function ‘void goya::rpc::RpcServerImpl::OnCallbackDone(google::protobuf::Message*, boost::shared_ptr<boost::asio::basic_stream_socket<boost::asio::ip::tcp> >)’:
/home/service/rpc/goya-rpc/src/rpc_server_impl.cc:101:44: warning: ‘int google::protobuf::MessageLite::ByteSize() const’ is deprecated (declared at /home/service/rpc/goya-rpc/thirdparty/install/include/google/protobuf/message_lite.h:430): Please use ByteSizeLong() instead [-Wdeprecated-declarations]
   int serialized_size = resp_msg->ByteSize();
                                            ^
c++: internal compiler error: Killed (program cc1plus)
Please submit a full bug report,
with preprocessed source if appropriate.
See <http://bugzilla.redhat.com/bugzilla> for instructions.
make[2]: *** [src/CMakeFiles/goya-rpc.dir/rpc_server_impl.cc.o] Error 4
make[1]: *** [src/CMakeFiles/goya-rpc.dir/all] Error 2
make: *** [all] Error 2

The error occurs because the build machine runs out of memory: expanding a large number of templates requires plenty of RAM.

# Ways to check Linux memory usage:
1. ps aux --sort -rss
2. free -m
3. top (press Shift+M to sort by memory usage)
4. cat /proc/meminfo

Solution:

You can work around this by temporarily adding a swap file:

=[ Step 1: create and enable the swap file ]=========================================

sudo dd if=/dev/zero of=/swapfile bs=64M count=16
# count is the number of blocks and 64M is the block size, so the total size is bs*count = 1024 MB
sudo mkswap /swapfile
sudo swapon /swapfile

Then rerun your build.

=[ Step 2: disable and remove the swap file when you are done ]======================

sudo swapoff /swapfile
sudo rm /swapfile

Note: if “g++: internal compiler error: Killed (program cc1plus)” still appears after creating the temporary swap space, the allocated space may not be large enough. You can allocate more.
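
To confirm the swap file is actually active before retrying the build, a quick check (swapon --show needs a reasonably recent util-linux; free -m is listed above):

swapon --show   # lists active swap areas
free -m         # the Swap line should now include the extra 1024 MB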