Category Archives: How to Fix

Successfully resolved error C3861: 'printf': identifier not found

Problem Description:

The error is as follows: error C3861: 'printf': identifier not found.

Solution

The message indicates that the identifier "printf" cannot be found. My guess was that this is caused by the compiler not including the header file stdio.h, so I added the line #include <stdio.h>. After that, the program no longer reported an error.

IntelliJ IDEA error: java: Compilation failed: internal java compiler error — solutions

Method 1: it may be a heap memory problem

In Settings -> Build, Execution, Deployment -> Compiler, find "Build process heap size (Mbytes)" and change it from 700 to 1024 (adjust as appropriate).

Method 2: it may be a project encoding problem

PS: In my case the project compiled after the encoding was changed to GBK. Relying on the system default (also GBK) did not work; the encoding had to be set to GBK manually.

SSLCertVerificationError when downloading with Python: [SSL: CERTIFICATE_VERIFY_FAILED]

When downloading a dataset from the PyTorch official website, an SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] error is raised. The following is my solution.

Error reason: when urllib opens an HTTPS link, it verifies the SSL certificate. When the target website uses a self-signed certificate, it throws a urllib.error.URLError.

Solution: disable certificate verification globally:

import ssl
ssl._create_default_https_context = ssl._create_unverified_context

Reference article link: https://blog.csdn.net/yixieling4397/article/details/79861379

org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph

An error is reported when the Flink SQL client submits the job:

2021-10-21 15:23:54,232 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2021-10-21 15:23:54,233 WARN  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Neither the HADOOP_CONF_DIR nor the YARN_CONF_DIR environment variable is set.The Flink YARN Client needs one of these to be set to properly load the Hadoop configuration for accessing YARN.
2021-10-21 15:23:54,291 INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider [] - Failing over to rm2
2021-10-21 15:23:54,337 INFO  org.apache.flink.yarn.YarnClusterDescriptor                  [] - Found Web Interface n101:36989 of application 'application_1634635118307_0001'.
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.

Failing over to rm2

The key is the log line above: the job can only be submitted through the active ResourceManager, and in my cluster the active one is rm2.

RT-Thread: assertion failed at function rt_application_init

Today, after using a message queue, a mailbox, the ADC, and multiple serial-port threads at the same time, the system produced a series of errors.

The error content (shown as a screenshot in the original post) was an assertion failure at rt_application_init.

After continuous attempts, I found a solution, which is recorded here.

Reduce the thread stack sizes of the main thread and the finsh shell thread to 1024 bytes (the value is for reference only).


Integrated development environment: RT-Thread Studio

MCU: STM32G030C8
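In RT-Thread Studio these stack sizes end up as macros in the generated rtconfig.h. A sketch of the relevant options (the macro names below are mainline RT-Thread's standard configuration macros; verify them against your own generated rtconfig.h):

```c
/* rtconfig.h fragment (generated by RT-Thread Studio):
   reduce both stacks to 1024 bytes as described above. */
#define RT_MAIN_THREAD_STACK_SIZE 1024   /* main thread stack size in bytes */
#define FINSH_THREAD_STACK_SIZE   1024   /* finsh/msh shell thread stack size */
```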

Solution to "Solving environment: failed" when updating Anaconda

Problem:
while upgrading Anaconda Navigator, the command conda update anaconda-navigator fails with "Solving environment: failed".

Solution:
add the Tsinghua Anaconda mirror as a package channel:

conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
conda config --set show_channel_urls yes

then run the upgrade again. (If the channel has already been added, conda will prompt that it exists.)

Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:

The error content:

{
    "error": {
        "root_cause": [
            {
                "type": "mapper_parsing_exception",
                "reason": "Root mapping definition has unsupported parameters:  [article : {properties={id={index=not_analyzed, store=true, type=long}, title={analyzer=standard, index=analyzed, store=true, type=text}, content={analyzer=standard, index=analyzed, store=true, type=text}}}]"
            }
        ],
        "type": "mapper_parsing_exception",
        "reason": "Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:  [article : {properties={id={index=not_analyzed, store=true, type=long}, title={analyzer=standard, index=analyzed, store=true, type=text}, content={analyzer=standard, index=analyzed, store=true, type=text}}}]",
        "caused_by": {
            "type": "mapper_parsing_exception",
            "reason": "Root mapping definition has unsupported parameters:  [article : {properties={id={index=not_analyzed, store=true, type=long}, title={analyzer=standard, index=analyzed, store=true, type=text}, content={analyzer=standard, index=analyzed, store=true, type=text}}}]"
        }
    },
    "status": 400
}

Wrong configuration

{
  "mappings": {
    "article": {
      "properties": {
        "id": {
          "type": "long",
          "store": true,
          "index":"not_analyzed"
        },
        "title": {
          "type": "text",
          "store": true,
          "index":"analyzed",
          "analyzer":"standard"
        },
        "content": {
          "type": "text",
          "store": true,
          "index":"analyzed",
          "analyzer":"standard"
        } 
      }
    } 
  }
}

Corrected configuration

{
  "mappings": {
      "properties": {
        "id": {
          "type": "long",
          "store": true
        },
        "title": {
          "type": "text",
          "store": true,
          "analyzer":"standard"
        },
        "content": {
          "type": "text",
          "store": true,
          "analyzer":"standard"
        } 
      }
  }
}

Cause of the problem: Elasticsearch 7 changed the mapping format. Mapping types (the "article" level) were removed, and "index" no longer accepts "analyzed"/"not_analyzed"; it is now a boolean, with analysis controlled by the field type and "analyzer".

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed

When npm run build is executed, an error is reported: FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed. Most of the answers found online say:
npm install -g increase-memory-limit
then enter the project directory and execute:
increase-memory-limit
For me this configuration had no effect. Another suggestion was to remove the double quotation marks around "XXX" under the node_modules folder (I forget exactly what XXX was); that was also invalid.
The solution that finally worked:
open a CMD window and enter the command: setx NODE_OPTIONS "--max_old_space_size=10240". Then close the command window and rerun npm run build. Success!