[Solved] Git Clone Error: error setting certificate verify locations

Description

When using git clone to clone a repository from GitHub or Gitee, the following error is reported:

error setting certificate verify locations:

CAfile: E:/Git/mingw64/ssl/certs/ca-bundle.crt

CApath: none

Analysis

According to the error message, Git failed to set the certificate verify locations; in other words, the path to the CA certificate file is wrong.

When cloning a remote repository over HTTPS, the server's certificate is verified first. If the local CA certificate file cannot be found, this error is reported.

This also explains why the error usually does not appear when cloning projects from GitLab: GitLab instances are generally deployed on an intranet and do not need to verify a security certificate.

The wrong path often comes from a "portable" (green) Git installation, i.e. one that was simply extracted and used rather than installed.

In that case the configured certificate path still points to its location on the original machine; if the path on the new machine is different, this error occurs.

Solution:

Based on the above analysis, there are two solutions:

  • Modify certificate file path (recommended)
  • Turn off certificate verification

Turning off certificate verification may cause security problems. It is recommended to modify the certificate file path.

Modify certificate file path

There are two ways:

  • Execute the configuration command (recommended)
  • Modify the configuration file

Both methods ultimately modify the same configuration file, but editing the file by hand is more cumbersome and error-prone, so executing the configuration command is recommended.

Execute configuration command

git config --system http.sslcainfo "<Git installation path>/mingw64/ssl/certs/ca-bundle.crt"
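To confirm that the new path has been picked up, you can read the value back (a quick check, not part of the original post):

git config --system --get http.sslcainfo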

Modify the configuration file

Git's system configuration file is located at: <Git installation path>\etc\gitconfig

In that file, change the certificate path to <Git installation path>/mingw64/ssl/certs/ca-bundle.crt and save it again.
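For reference, the relevant section of etc\gitconfig should end up looking roughly like this (the path below is only an illustration; substitute your own installation path):

[http]
	sslCAInfo = D:/Git/mingw64/ssl/certs/ca-bundle.crt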

Turn off certificate verification

git config --system http.sslverify false

This method may cause git security problems and is not recommended.
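If you do disable verification temporarily, you can re-enable it later with the same switch (standard git config usage, not from the original post):

git config --system http.sslverify true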

Failed to Initialize Error: error execution phase preflight: [preflight] Some fatal errors occurred: [ERROR Port-6443]

[root@master ~]# kubeadm init --config=kubeadm-config.yaml --experimental-upload-certs | tee kubeadm-init.log
Flag --experimental-upload-certs has been deprecated, use --upload-certs instead
[init] Using Kubernetes version: v1.15.1
[preflight] Running pre-flight checks
    [WARNING SystemVerification]: this Docker version is not on the list of validated versions: 20.10.11. Latest validated version: 18.09
error execution phase preflight: [preflight] Some fatal errors occurred:
    [ERROR Port-6443]: Port 6443 is in use
    [ERROR Port-10251]: Port 10251 is in use
    [ERROR Port-10252]: Port 10252 is in use
    [ERROR FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml]: /etc/kubernetes/manifests/kube-apiserver.yaml already exists
    [ERROR FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml]: /etc/kubernetes/manifests/kube-controller-manager.yaml already exists
    [ERROR FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml]: /etc/kubernetes/manifests/kube-scheduler.yaml already exists
    [ERROR FileAvailable--etc-kubernetes-manifests-etcd.yaml]: /etc/kubernetes/manifests/etcd.yaml already exists
    [ERROR Port-10250]: Port 10250 is in use
    [ERROR Port-2379]: Port 2379 is in use
    [ERROR Port-2380]: Port 2380 is in use
    [ERROR DirAvailable--var-lib-etcd]: /var/lib/etcd is not empty
[preflight] If you know what you are doing, you can make a check non-fatal with `--ignore-preflight-errors=...`
Reason:
After modifying the kubeadm-config.yaml file, the previous kubeadm state was not cleared, so the ports and manifest files from the earlier initialization are still occupied.
Solution:
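One common way to clear the previous control-plane state is kubeadm reset. A minimal sketch, assuming this node can simply be re-initialized (in most versions this wipes /etc/kubernetes/manifests and the local etcd data directory):

kubeadm reset -f

Then re-run kubeadm init with the updated configuration, as shown below.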
Result:
The k8s cluster was initialized successfully.
[root@master ~]# kubeadm init --config kubeadm-config.yaml --ignore-preflight-errors=SystemVerification

 

[Solved] Error downloading standard development IDs for MRPC. You will need to manually split your data.

Reason for error reporting

  • The original download link of the MRPC dataset is no longer valid; the MRPC entry was removed from TASK2PATH and replaced by the two new links MRPC_TRAIN and MRPC_TEST.
  • Splitting the dataset requires a mapping file (dev_ids.tsv), which can no longer be obtained from the original download link.

Solution:

1. Comment out the code that downloads dev_ids.tsv (shown as the commented-out block in the script below).

2. Download the dev_ids.tsv file and put it into the MRPC folder (glue_data/MRPC by default).

3. Rerun the code.
''' Script for downloading all GLUE data.
Note: for legal reasons, we are unable to host MRPC.
You can either use the version hosted by the SentEval team, which is already tokenized,
or you can download the original data from (https://download.microsoft.com/download/D/4/6/D46FF87A-F6B9-4252-AA8B-3604ED519838/MSRParaphraseCorpus.msi) and extract the data from it manually.
For Windows users, you can run the .msi file. For Mac and Linux users, consider an external library such as 'cabextract' (see below for an example).
You should then rename and place specific files in a folder (see below for an example).
mkdir MRPC
cabextract MSRParaphraseCorpus.msi -d MRPC
cat MRPC/_2DEC3DBE877E4DB192D17C0256E90F1D | tr -d $'\r' > MRPC/msr_paraphrase_train.txt
cat MRPC/_D7B391F9EAFF4B1B8BCE8F21B20B1B61 | tr -d $'\r' > MRPC/msr_paraphrase_test.txt
rm MRPC/_*
rm MSRParaphraseCorpus.msi
'''

import os
import sys
import shutil
import argparse
import tempfile
import urllib
import io
if sys.version_info >= (3, 0):
    import urllib.request
import zipfile

URLLIB=urllib
if sys.version_info >= (3, 0):
    URLLIB=urllib.request

TASKS = ["CoLA", "SST", "MRPC", "QQP", "STS", "MNLI", "QNLI", "RTE", "WNLI", "diagnostic"]
TASK2PATH = {"CoLA":'https://dl.fbaipublicfiles.com/glue/data/CoLA.zip',
             "SST":'https://dl.fbaipublicfiles.com/glue/data/SST-2.zip',
             "QQP":'https://dl.fbaipublicfiles.com/glue/data/STS-B.zip',
             "STS":'https://dl.fbaipublicfiles.com/glue/data/QQP-clean.zip',
             "MNLI":'https://dl.fbaipublicfiles.com/glue/data/MNLI.zip',
             "QNLI":'https://dl.fbaipublicfiles.com/glue/data/QNLIv2.zip',
             "RTE":'https://dl.fbaipublicfiles.com/glue/data/RTE.zip',
             "WNLI":'https://dl.fbaipublicfiles.com/glue/data/WNLI.zip',
             "diagnostic":'https://dl.fbaipublicfiles.com/glue/data/AX.tsv'}

MRPC_TRAIN = 'https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt'
MRPC_TEST = 'https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt'

def download_and_extract(task, data_dir):
    print("Downloading and extracting %s..." % task)
    if task == "MNLI":
        print("\tNote (12/10/20): This script no longer downloads SNLI. You will need to manually download and format the data to use SNLI.")
    data_file = "%s.zip" % task
    URLLIB.urlretrieve(TASK2PATH[task], data_file)
    with zipfile.ZipFile(data_file) as zip_ref:
        zip_ref.extractall(data_dir)
    os.remove(data_file)
    print("\tCompleted!")

def format_mrpc(data_dir, path_to_data):
    print("Processing MRPC...")
    mrpc_dir = os.path.join(data_dir, "MRPC")
    if not os.path.isdir(mrpc_dir):
        os.mkdir(mrpc_dir)
    if path_to_data:
        mrpc_train_file = os.path.join(path_to_data, "msr_paraphrase_train.txt")
        mrpc_test_file = os.path.join(path_to_data, "msr_paraphrase_test.txt")
    else:
        try:
            mrpc_train_file = os.path.join(mrpc_dir, "msr_paraphrase_train.txt")
            mrpc_test_file = os.path.join(mrpc_dir, "msr_paraphrase_test.txt")
            URLLIB.urlretrieve(MRPC_TRAIN, mrpc_train_file)
            URLLIB.urlretrieve(MRPC_TEST, mrpc_test_file)
        except urllib.error.HTTPError:
            print("Error downloading MRPC")
            return
    assert os.path.isfile(mrpc_train_file), "Train data not found at %s" % mrpc_train_file
    assert os.path.isfile(mrpc_test_file), "Test data not found at %s" % mrpc_test_file

    with io.open(mrpc_test_file, encoding='utf-8') as data_fh, \
            io.open(os.path.join(mrpc_dir, "test.tsv"), 'w', encoding='utf-8') as test_fh:
        header = data_fh.readline()
        test_fh.write("index\t#1 ID\t#2 ID\t#1 String\t#2 String\n")
        for idx, row in enumerate(data_fh):
            label, id1, id2, s1, s2 = row.strip().split('\t')
            test_fh.write("%d\t%s\t%s\t%s\t%s\n" % (idx, id1, id2, s1, s2))

    # try:
    #     URLLIB.urlretrieve(TASK2PATH["MRPC"], os.path.join(mrpc_dir, "dev_ids.tsv"))
    # except KeyError or urllib.error.HTTPError:
    #     print("\tError downloading standard development IDs for MRPC. You will need to manually split your data.")
    #     return

    dev_ids = []
    with io.open(os.path.join(mrpc_dir, "dev_ids.tsv"), encoding='utf-8') as ids_fh:
        for row in ids_fh:
            dev_ids.append(row.strip().split('\t'))

    with io.open(mrpc_train_file, encoding='utf-8') as data_fh, \
         io.open(os.path.join(mrpc_dir, "train.tsv"), 'w', encoding='utf-8') as train_fh, \
         io.open(os.path.join(mrpc_dir, "dev.tsv"), 'w', encoding='utf-8') as dev_fh:
        header = data_fh.readline()
        train_fh.write(header)
        dev_fh.write(header)
        for row in data_fh:
            label, id1, id2, s1, s2 = row.strip().split('\t')
            if [id1, id2] in dev_ids:
                dev_fh.write("%s\t%s\t%s\t%s\t%s\n" % (label, id1, id2, s1, s2))
            else:
                train_fh.write("%s\t%s\t%s\t%s\t%s\n" % (label, id1, id2, s1, s2))

    print("\tCompleted!")

def download_diagnostic(data_dir):
    print("Downloading and extracting diagnostic...")
    if not os.path.isdir(os.path.join(data_dir, "diagnostic")):
        os.mkdir(os.path.join(data_dir, "diagnostic"))
    data_file = os.path.join(data_dir, "diagnostic", "diagnostic.tsv")
    URLLIB.urlretrieve(TASK2PATH["diagnostic"], data_file)
    print("\tCompleted!")
    return

def get_tasks(task_names):
    task_names = task_names.split(',')
    if "all" in task_names:
        tasks = TASKS
    else:
        tasks = []
        for task_name in task_names:
            assert task_name in TASKS, "Task %s not found!" % task_name
            tasks.append(task_name)
    return tasks

def main(arguments):
    parser = argparse.ArgumentParser()
    parser.add_argument('-d', '--data_dir', help='directory to save data to', type=str, default='glue_data')
    parser.add_argument('-t', '--tasks', help='tasks to download data for as a comma separated string',
                        type=str, default='all')
    parser.add_argument('--path_to_mrpc', help='path to directory containing extracted MRPC data, msr_paraphrase_train.txt and msr_paraphrase_test.txt',
                        type=str, default='')
    args = parser.parse_args(arguments)

    if not os.path.isdir(args.data_dir):
        os.mkdir(args.data_dir)
    tasks = get_tasks(args.tasks)

    for task in tasks:
        if task == 'MRPC':
            format_mrpc(args.data_dir, args.path_to_mrpc)
        elif task == 'diagnostic':
            download_diagnostic(args.data_dir)
        else:
            download_and_extract(task, args.data_dir)


if __name__ == '__main__':
    sys.exit(main(sys.argv[1:]))
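For reference, an invocation consistent with the argparse options above, after dev_ids.tsv has been placed in glue_data/MRPC (the script file name is assumed here):

python download_glue_data.py --data_dir glue_data --tasks MRPC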

Successfully downloaded


[Solved] nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model

FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model

The following error is reported when starting a project built on Flowable and Spring Boot:

05.824 INFO  o.f.s.SpringProcessEngineConfiguration:922  main                    Executing configure() of class org.flowable.dmn.spring.configurator.SpringDmnEngineConfigurator (priority:200000)
09.818 INFO  liquibase.lockservice         :26   main                    Waiting for changelog lock....
... (the "Waiting for changelog lock...." message keeps repeating, roughly every 10 seconds) ...
00.142 INFO  liquibase.lockservice         :26   main                    Waiting for changelog lock....
10.377 WARN  o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext:591  main                    Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'appletLeaveController': Unsatisfied dependency expressed through field 'leaveService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'appletLeaveServiceImpl': Unsatisfied dependency expressed through field 'flowableProcessInstanceService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'flowableProcessInstanceServiceImpl': Unsatisfied dependency expressed through field 'historyService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'historyServiceBean' defined in class path resource [org/flowable/spring/boot/ProcessEngineServicesAutoConfiguration.class]: Unsatisfied dependency expressed through method 'historyServiceBean' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processEngine': FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
10.572 INFO  c.a.druid.pool.DruidDataSource:2071 main                    {dataSource-1} closing ...
10.638 INFO  c.a.druid.pool.DruidDataSource:2144 main                    {dataSource-1} closed
10.697 INFO  o.a.c.core.StandardService    :173  main                    Stopping service [Tomcat]
10.981 ERROR o.s.boot.SpringApplication    :830  main                    Application run failed
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'appletLeaveController': Unsatisfied dependency expressed through field 'leaveService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'appletLeaveServiceImpl': Unsatisfied dependency expressed through field 'flowableProcessInstanceService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'flowableProcessInstanceServiceImpl': Unsatisfied dependency expressed through field 'historyService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'historyServiceBean' defined in class path resource [org/flowable/spring/boot/ProcessEngineServicesAutoConfiguration.class]: Unsatisfied dependency expressed through method 'historyServiceBean' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processEngine': FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:659)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:639)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:119)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:399)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1431)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:619)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:953)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:583)
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:145)
	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:740)
	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:415)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:303)
	at com.sinosoft.MobileAppletBusinessApplication.main(MobileAppletBusinessApplication.java:32)
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'appletLeaveServiceImpl': Unsatisfied dependency expressed through field 'flowableProcessInstanceService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'flowableProcessInstanceServiceImpl': Unsatisfied dependency expressed through field 'historyService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'historyServiceBean' defined in class path resource [org/flowable/spring/boot/ProcessEngineServicesAutoConfiguration.class]: Unsatisfied dependency expressed through method 'historyServiceBean' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processEngine': FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:659)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:639)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:119)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:399)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1431)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:619)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1389)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1309)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:656)
	... 18 common frames omitted
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'flowableProcessInstanceServiceImpl': Unsatisfied dependency expressed through field 'historyService'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'historyServiceBean' defined in class path resource [org/flowable/spring/boot/ProcessEngineServicesAutoConfiguration.class]: Unsatisfied dependency expressed through method 'historyServiceBean' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processEngine': FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:659)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:639)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:119)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:399)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1431)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:619)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1389)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1309)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:656)
	... 32 common frames omitted
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'historyServiceBean' defined in class path resource [org/flowable/spring/boot/ProcessEngineServicesAutoConfiguration.class]: Unsatisfied dependency expressed through method 'historyServiceBean' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processEngine': FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:800)
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:541)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1352)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1195)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1389)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1309)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:656)
	... 46 common frames omitted
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processEngine': FactoryBean threw exception on object creation; nested exception is org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:176)
	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:101)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1884)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getObjectForBeanInstance(AbstractAutowireCapableBeanFactory.java:1284)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:345)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1389)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1309)
	at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887)
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
	... 59 common frames omitted
Caused by: org.flowable.common.engine.api.FlowableException: Error initialising dmn data model
	at org.flowable.common.engine.impl.db.LiquibaseBasedSchemaManager.initSchema(LiquibaseBasedSchemaManager.java:68)
	at org.flowable.dmn.engine.impl.db.DmnDbSchemaManager.initSchema(DmnDbSchemaManager.java:36)
	at org.flowable.dmn.engine.impl.cmd.SchemaOperationsDmnEngineBuild.execute(SchemaOperationsDmnEngineBuild.java:27)
	at org.flowable.dmn.engine.impl.cmd.SchemaOperationsDmnEngineBuild.execute(SchemaOperationsDmnEngineBuild.java:23)
	at org.flowable.common.engine.impl.interceptor.DefaultCommandInvoker.execute(DefaultCommandInvoker.java:22)
	at org.flowable.common.engine.impl.interceptor.TransactionContextInterceptor.execute(TransactionContextInterceptor.java:53)
	at org.flowable.common.engine.impl.interceptor.CommandContextInterceptor.execute(CommandContextInterceptor.java:72)
	at org.flowable.common.spring.SpringTransactionInterceptor.lambda$execute$0(SpringTransactionInterceptor.java:56)
	at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140)
	at org.flowable.common.spring.SpringTransactionInterceptor.execute(SpringTransactionInterceptor.java:56)
	at org.flowable.common.engine.impl.interceptor.LogInterceptor.execute(LogInterceptor.java:30)
	at org.flowable.common.engine.impl.cfg.CommandExecutorImpl.execute(CommandExecutorImpl.java:56)
	at org.flowable.dmn.engine.impl.DmnEngineImpl.<init>(DmnEngineImpl.java:48)
	at org.flowable.dmn.engine.DmnEngineConfiguration.buildDmnEngine(DmnEngineConfiguration.java:217)
	at org.flowable.dmn.spring.SpringDmnEngineConfiguration.buildDmnEngine(SpringDmnEngineConfiguration.java:69)
	at org.flowable.dmn.spring.configurator.SpringDmnEngineConfigurator.initDmnEngine(SpringDmnEngineConfigurator.java:68)
	at org.flowable.dmn.spring.configurator.SpringDmnEngineConfigurator.configure(SpringDmnEngineConfigurator.java:57)
	at org.flowable.common.engine.impl.AbstractEngineConfiguration.configuratorsAfterInit(AbstractEngineConfiguration.java:923)
	at org.flowable.engine.impl.cfg.ProcessEngineConfigurationImpl.init(ProcessEngineConfigurationImpl.java:1016)
	at org.flowable.engine.impl.cfg.ProcessEngineConfigurationImpl.buildProcessEngine(ProcessEngineConfigurationImpl.java:915)
	at org.flowable.spring.SpringProcessEngineConfiguration.buildProcessEngine(SpringProcessEngineConfiguration.java:72)
	at org.flowable.spring.ProcessEngineFactoryBean.getObject(ProcessEngineFactoryBean.java:60)
	at org.flowable.spring.ProcessEngineFactoryBean.getObject(ProcessEngineFactoryBean.java:32)
	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:169)
	... 69 common frames omitted
Caused by: org.flowable.common.engine.api.FlowableException: Error updating dmn engine tables
	at org.flowable.common.engine.impl.db.LiquibaseBasedSchemaManager.schemaUpdate(LiquibaseBasedSchemaManager.java:105)
	at org.flowable.common.engine.impl.db.LiquibaseBasedSchemaManager.initSchema(LiquibaseBasedSchemaManager.java:61)
	... 92 common frames omitted
Caused by: liquibase.exception.LockException: Could not acquire change log lock.  Currently locked by DESKTOP-T9LSH1H (192.168.1.108) since 22-6-15 9:27 AM
	at liquibase.lockservice.StandardLockService.waitForLock(StandardLockService.java:234)
	at liquibase.Liquibase.lambda$update$1(Liquibase.java:214)
	at liquibase.Scope.lambda$child$0(Scope.java:177)
	at liquibase.Scope.child(Scope.java:186)
	at liquibase.Scope.child(Scope.java:176)
	at liquibase.Scope.child(Scope.java:155)
	at liquibase.Liquibase.runInScope(Liquibase.java:2404)
	at liquibase.Liquibase.update(Liquibase.java:211)
	at liquibase.Liquibase.update(Liquibase.java:197)
	at liquibase.Liquibase.update(Liquibase.java:193)
	at liquibase.Liquibase.update(Liquibase.java:185)
	at org.flowable.common.engine.impl.db.LiquibaseBasedSchemaManager.schemaUpdate(LiquibaseBasedSchemaManager.java:103)
	... 93 common frames omitted

Process finished with exit code 1

After investigation, the problem turned out to be in the MySQL database: Liquibase could not acquire the changelog lock because a stale lock record was left behind.

Solution: find the table whose name ends with databasechangeloglock and release the stale lock by resetting the lock row, as sketched below.

Note: every table whose name ends with databasechangeloglock should be checked.
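As a sketch of what releasing the lock looks like in practice. The database name and credentials below are placeholders, and the table name is the one the Flowable DMN engine typically uses (ACT_DMN_DATABASECHANGELOGLOCK); repeat the statement for every *databasechangeloglock table in your schema:

mysql -u root -p your_database -e "UPDATE ACT_DMN_DATABASECHANGELOGLOCK SET LOCKED=0, LOCKGRANTED=NULL, LOCKEDBY=NULL WHERE ID=1;"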

python chatterbot [nltk_data] Error loading stopwords: <urlopen error [Errno 11004]

The following error occurred while running the project:

[nltk_data] Error loading stopwords: <urlopen error [Errno 11004]
[nltk_data]     getaddrinfo failed>
[nltk_data] Error loading averaged_perceptron_tagger: <urlopen error
[nltk_data]     [Errno 11004] getaddrinfo failed>

The solution is as follows:

Go to: https://github.com/nltk/nltk_data

Go to the directory /packages/corpora/, find the corresponding file stopwords.zip, and put it into the corresponding local directory.

It is recommended to download the entire nltk_data project (about 695 MB) to avoid further download failures.

Extract the zip file and copy everything under

nltk_data-gh-pages.zip\nltk_data-gh-pages\packages

into the following directory:

C:\Users\Administrator\AppData\Roaming\nltk_data

The nltk_data directory may differ from machine to machine; on my machine it is the directory above.
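If you are not sure where NLTK expects the data, you can print its search paths (a quick check, not part of the original post):

python -c "import nltk; print(nltk.data.path)"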

Then check the corresponding file:

\venv\Lib\site-packages\chatterbot\utils.py under the current project directory

(For some setups the virtual environment is not under the project directory; locate your own site-packages directory according to your configuration and find the file there.)

The corresponding nltk_download_corpus('xxx') code needs to be modified as follows:


def download_nltk_stopwords():
    """
    Download required NLTK stopwords corpus if it has not already been downloaded.
    """
    nltk_download_corpus('corpora/stopwords')


def download_nltk_wordnet():
    """
    Download required NLTK corpora if they have not already been downloaded.
    """
    nltk_download_corpus('corpora/wordnet')


def download_nltk_averaged_perceptron_tagger():
    """
    Download the NLTK averaged perceptron tagger that is required for this algorithm
    to run only if the corpora has not already been downloaded.
    """
    nltk_download_corpus('taggers/averaged_perceptron_tagger')


def download_nltk_vader_lexicon():
    """
    Download the NLTK vader lexicon for sentiment analysis
    that is required for this algorithm to run.
    """
    nltk_download_corpus('sentiment/vader_lexicon')

Done!

[Solved] ERROR: Unknown host CPU architecture: arm64

ERROR: Unknown host CPU architecture: arm64

When compiling an Android NDK project built on Android.mk, the following error occurs on an M1 MacBook Pro:

ERROR: Unknown host CPU architecture: arm64

You need to modify the ndk-build script in the NDK root directory (presumably because the M1 uses the ARM architecture). The original script:

#!/bin/sh
DIR="$(cd "$(dirname "$0")" && pwd)"
$DIR/build/ndk-build "$@"

Change to

#!/bin/sh
DIR="$(cd "$(dirname "$0")" && pwd)"
arch -x86_64 /bin/bash $DIR/build/ndk-build "$@"

[Solved] Error 4 opening dom ASM/Self in 0x8283c00

While installing Oracle RAC 19.3.0.0 on RHEL 7.9, during the "run root.sh" step of the GI installation, the script runs normally on the first node, but "Error 4 opening dom ASM/Self in 0x8283c00" appears when running root.sh on the second node.

The root.sh script executed successfully on node 1.

The problem appears when running the root.sh script on node 2:

Creating /etc/oratab file...
Entries will be added to the /etc/oratab file as needed by
Database Configuration Assistant when a database is created
Finished running generic part of root script.
Now product-specific root actions will be performed.
Relinking oracle with rac_on option
Using configuration parameter file: /u01/app/19.0.0/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/grid/crsdata/momdb2/crsconfig/rootcrs_momdb2_2022-06-19_11-05-10AM.log
2022/06/19 11:05:13 CLSRSC-594: Executing installation step 1 of 19: 'SetupTFA'.
2022/06/19 11:05:14 CLSRSC-594: Executing installation step 2 of 19: 'ValidateEnv'.
2022/06/19 11:05:14 CLSRSC-363: User ignored prerequisites during installation
2022/06/19 11:05:14 CLSRSC-594: Executing installation step 3 of 19: 'CheckFirstNode'.
2022/06/19 11:05:14 CLSRSC-594: Executing installation step 4 of 19: 'GenSiteGUIDs'.
2022/06/19 11:05:14 CLSRSC-594: Executing installation step 5 of 19: 'SetupOSD'.
2022/06/19 11:05:14 CLSRSC-594: Executing installation step 6 of 19: 'CheckCRSConfig'.
2022/06/19 11:05:15 CLSRSC-594: Executing installation step 7 of 19: 'SetupLocalGPNP'.
2022/06/19 11:05:16 CLSRSC-594: Executing installation step 8 of 19: 'CreateRootCert'.
2022/06/19 11:05:16 CLSRSC-594: Executing installation step 9 of 19: 'ConfigOLR'.
2022/06/19 11:05:23 CLSRSC-594: Executing installation step 10 of 19: 'ConfigCHMOS'.
2022/06/19 11:05:23 CLSRSC-594: Executing installation step 11 of 19: 'CreateOHASD'.
2022/06/19 11:05:24 CLSRSC-594: Executing installation step 12 of 19: 'ConfigOHASD'.
2022/06/19 11:05:24 CLSRSC-330: Adding Clusterware entries to file 'oracle-ohasd.service'
2022/06/19 11:05:35 CLSRSC-4002: Successfully installed Oracle Trace File Analyzer (TFA) Collector.
2022/06/19 11:06:01 CLSRSC-594: Executing installation step 13 of 19: 'InstallAFD'.
2022/06/19 11:06:27 CLSRSC-594: Executing installation step 14 of 19: 'InstallACFS'.
2022/06/19 11:07:02 CLSRSC-594: Executing installation step 15 of 19: 'InstallKA'.
2022/06/19 11:07:03 CLSRSC-594: Executing installation step 16 of 19: 'InitConfig'.
2022/06/19 11:07:10 CLSRSC-594: Executing installation step 17 of 19: 'StartCluster'.
2022/06/19 11:11:03 CLSRSC-343: Successfully started Oracle Clusterware stack
2022/06/19 11:11:03 CLSRSC-594: Executing installation step 18 of 19: 'ConfigNode'.
2022/06/19 11:11:11 CLSRSC-594: Executing installation step 19 of 19: 'PostConfig'.
2022/06/19 11:11:26 CLSRSC-325: Configure Oracle Grid Infrastructure for a Cluster ... succeeded
***Error 4 opening dom ASM/Self in 0x8283c00
Domain name to open is ASM/Self 
Error 4 opening dom ASM/Self in 0x8283c00***

According to the MOS note "19C: While Executing Root.sh on Remote Nodes HIT UNEXPECTED 'ERROR 4 OPENING DOM ASM/SELF IN 0x57f7d60'" (Doc ID 2571719.1), this issue has no effect on the installation and can be ignored.

[Solved] .NetCore2.2 Upgrade to 3.1 Error: HTTP Error 500.37 - ANCM Failed to Start Within Startup Time

While upgrading an old project, an Ocelot gateway interface program, from .NET Core 2.2 to 3.1, the project file is changed as follows:

- <TargetFramework>netcoreapp2.2</TargetFramework> 
+ <TargetFramework>netcoreapp3.1</TargetFramework>

Replace the services.AddMvc() call in your project as follows:

services.AddMvc(options => { options.EnableEndpointRouting = false; });
services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_3_0);

The Configure(IApplicationBuilder app, IWebHostEnvironment env) method keeps the original app.UseMvc(). The project compiles, but at runtime the page reports: HTTP Error 500.37 - ANCM Failed to Start Within Startup Time. A fix suggested on the web is to raise the startupTimeLimit value:

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <location>
    <system.webServer>
      <aspNetCore xdt:Transform="SetAttributes(startupTimeLimit)" startupTimeLimit="300">
      </aspNetCore>
    </system.webServer>
  </location>
</configuration>

With that change, the resulting page was still completely inaccessible.

 

Solution:

You only need to add an <AspNetCoreHostingModel> node set to OutOfProcess to the project file:

  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
	<AspNetCoreHostingModel>OutOfProcess</AspNetCoreHostingModel>
  </PropertyGroup>

After doing the above, the project runs normally. In addition, the services.AddMvc()/app.UseMvc() approach shown above is not the recommended one for .NET Core 3.1: comment out app.UseMvc() in the Configure method and add the following code instead.

            //app.UseMvc();
            app.UseRouting();//add
            app.UseEndpoints(endpoints =>//add
            {
                endpoints.MapControllers();
            });

 

[Solved] RuntimeError: cuda runtime error (801) : operation not supported at

cuda runtime error (801) : operation not supported

Error:
RuntimeError: cuda runtime error (801) : operation not supported at C:\w\1\s\windows\pytorch\torch/csrc/generic/StorageSharing.cpp:245 #85

Reason:
Presumably, Windows does not support this kind of multi-process data loading (multiprocessing workers in the DataLoader).

Solution:

    layer_loader = NeighborSampler(data.adj_t, node_idx=None, sizes=[-1], batch_size=4096, shuffle=False, num_workers=12)

For the code above, simply delete the num_workers argument (or set num_workers=0):

layer_loader = NeighborSampler(data.adj_t, node_idx=None, sizes=[-1], batch_size=4096, shuffle=False)

 

[Solved] Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

This error occurs when executing SQL that involves collections on the Hive on Spark engine:
[42000][3] Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Spark job failed during runtime. Please check stacktrace for the root cause.

 

Solution 1: Switch the execution engine to MR
set hive.execution.engine=mr;

Solution 2:
set mapred.map.tasks.speculative.execution=true;
set mapred.reduce.tasks.speculative.execution=true;

 

[Solved] ERROR: Unexpected bus error encountered in worker. This might be caused by insufficient shared memory

ERROR: Unexpected bus error encountered in worker. This might be caused by insufficient shared memory

1. Question

Using a PyTorch DataLoader inside Docker may produce the error above.

2. Solution

Check the mounted filesystem sizes with df -h inside the container:

You will see that /dev/shm is only 64M, while the DataLoader has num_workers set to a larger value; the worker processes exchange data through shared memory, so 64M is not enough.

Please note that PyTorch uses shared memory to share data between processes, so if torch multiprocessing is used (e.g. for multithreaded data loaders) the default shared memory segment size that the container runs with is not enough, and you should increase the shared memory size either with --ipc=host or --shm-size command line options to nvidia-docker run.

Solution:
(1) Set num_workers=0 (note that setting it to 1 does not work).
(2) Give the Docker container more shared memory:

--ipc=host  or  --shm-size 8G

With --ipc=host the shared memory follows the host's maximum, so this option is recommended.
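For example, a container start command might look like this (the image name and the other options are placeholders; only the shared-memory flags matter here):

docker run --gpus all --ipc=host -it pytorch/pytorch:latest bash
# or give the container a fixed, larger shared-memory segment instead:
docker run --gpus all --shm-size=8g -it pytorch/pytorch:latest bash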

After restarting the container, the error no longer occurs.

 

[Solved] error: #error "Please include Eigen/Geometry instead of including headers inside the src directory directly."

error: #error "Please include Eigen/Geometry instead of including headers inside the src directory directly."

1 - Error
2 - Solution
The message prompts us to #include <Eigen/Geometry>; the header file to edit is Pangolin/include/pangolin/plot/range.h.
After changing the include, however, a new error is reported: there is no such file or directory as Eigen/Geometry.

Solution:

cd /usr/include
sudo ln -sf eigen3/Eigen Eigen
sudo ln -sf eigen3/unsupported unsupported
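To confirm that the links resolve (a quick sanity check, not from the original post):

ls -l /usr/include/Eigen /usr/include/unsupported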