Tag Archives: Tools

[Solved] ERROR Error: No module factory available / PROJECT_CONFIG_JSON_NOT_VALID_OR_NOT_EXIST

1. The WeChat one-click packaging tool reports the following errors:

ERROR Error: No module factory available for dependency type: CssDependency

error: please check whether project.config.json exists and is valid (code 19)

2. Solution

The HBuilderX development tool cannot be placed on the same disk as the project files; otherwise the above errors occur. If, for example, the development tool is on drive D and the project files are on drive E or F, the errors are not reported.

How to Compress Images with Thumbnailator When Uploading

Uploading large images is slow, so the images are compressed before being stored.

Thumbnailator website: http://code.google.com/p/thumbnailator/

Add the dependency to pom.xml:

    <dependency>
        <groupId>net.coobird</groupId>
        <artifactId>thumbnailator</artifactId>
        <version>0.4.8</version>
    </dependency>

Controller:

@PostMapping("/upload")
    public Object upload(@RequestParam("file") MultipartFile file) throws IOException {
//        if (file.getSize() > 2 * ONE_MB) {
//            return ResponseUtil.fail(500, "Image size exceeds 2M!");
//        }
        String originalFilename = file.getOriginalFilename();
        logger.info("Before compression:" + file.getSize());
        ByteArrayInputStream inputStream = uploadFile(file);
        if (inputStream == null) {
            return ResponseUtil.fail(500, "Image compression failed!");
        }
        logger.info("After compression:" + inputStream.available());
        LitemallStorage litemallStorage = storageService.store(inputStream, inputStream.available(), file.getContentType(), originalFilename);
        return ResponseUtil.ok(litemallStorage);
    }

    /**
     * Image compression
     * @param file the uploaded image file
     * @return a compressed image stream, or null if compression fails
     */
    public static ByteArrayInputStream uploadFile(MultipartFile file) {
        if (file == null) {
            return null;
        }
        try {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            // Scale to 40% of the original dimensions at 25% output quality
            Thumbnails.of(file.getInputStream()).scale(0.4f).outputQuality(0.25f).toOutputStream(baos);
            return parse(baos);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }


    // Convert an OutputStream back to an InputStream
    public static ByteArrayInputStream parse(OutputStream out) {
        // The stream written by Thumbnails above is a ByteArrayOutputStream,
        // so it can be cast and its buffer wrapped directly
        ByteArrayOutputStream baos = (ByteArrayOutputStream) out;
        return new ByteArrayInputStream(baos.toByteArray());
    }

Service:

 /**
     * Store a file object
     *
     * @param inputStream File input stream
     * @param contentLength The length of the file.
     * @param contentType File type
     * @param fileName File index name
     */
    public LitemallStorage store(InputStream inputStream, long contentLength, String contentType, String fileName) {
        String key = generateKey(fileName);
        store(inputStream, contentLength, contentType, key);

        String url = generateUrl(key);
        LitemallStorage storageInfo = new LitemallStorage();
        storageInfo.setName(fileName);
        storageInfo.setSize((int) contentLength);
        storageInfo.setType(contentType);
        storageInfo.setKey(key);
        storageInfo.setUrl(url);
        litemallStorageService.add(storageInfo);

        return storageInfo;
    }


    @Override
    public void store(InputStream inputStream, long contentLength, String contentType, String keyName) {
        try {
            // Simple file upload, supports up to 5 GB, suitable for small file upload, recommend using this interface for files under 20M
            ObjectMetadata objectMetadata = new ObjectMetadata();
            objectMetadata.setContentLength(contentLength);
            objectMetadata.setContentType(contentType);
            // The object key (Key) is the unique identifier of the object in the storage bucket.
            PutObjectRequest putObjectRequest = new PutObjectRequest(bucketName, keyName, inputStream, objectMetadata);
            getOSSClient().putObject(putObjectRequest);
        } catch (Exception ex) {
            logger.error(ex.getMessage(), ex);
        }

    }

Other Thumbnailator methods:

// Shrink the image to fit within the specified size (the original aspect ratio is preserved)
// Here the image is compressed into a thumbnail no larger than 400×500
Thumbnails.of(fromPic).size(400, 500).toFile(toPic);

// Scale by the specified ratio
Thumbnails.of(fromPic).scale(0.2f).toFile(toPic); // scale down
Thumbnails.of(fromPic).scale(2f); // scale up

// Keep the image dimensions and compress only the file size via outputQuality; 1 is the highest quality
Thumbnails.of(fromPic).scale(1f).outputQuality(0.25f).toFile(toPic);

Keras Model Saving: save() vs save_weights()

Today we ran an experiment on the models saved by Keras, hoping to help you understand the differences between the saving methods.


We know that a Keras model is usually saved as a file with the .h5 suffix, such as final_model.h5. For the same .h5 format, save() and save_weights() have different effects.

We use MNIST, the most common dataset in the universe, for this experiment:

inputs = Input(shape=(784, ))
x = Dense(64, activation='relu')(inputs)
x = Dense(64, activation='relu')(x)
y = Dense(10, activation='softmax')(x)

model = Model(inputs=inputs, outputs=y)

Then we import the MNIST data for training and save the model in both ways. I also save the untrained model, as follows:

from keras.models import Model
from keras.layers import Input, Dense
from keras.datasets import mnist
from keras.utils import np_utils


(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train=x_train.reshape(x_train.shape[0],-1)/255.0
x_test=x_test.reshape(x_test.shape[0],-1)/255.0
y_train=np_utils.to_categorical(y_train,num_classes=10)
y_test=np_utils.to_categorical(y_test,num_classes=10)

inputs = Input(shape=(784, ))
x = Dense(64, activation='relu')(inputs)
x = Dense(64, activation='relu')(x)
y = Dense(10, activation='softmax')(x)

model = Model(inputs=inputs, outputs=y)

model.save('m1.h5')
model.summary()
model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=32, epochs=10)
#loss,accuracy=model.evaluate(x_test,y_test)

model.save('m2.h5')
model.save_weights('m3.h5')

As you can see, I saved three files: m1.h5, m2.h5, and m3.h5. So let's see how they differ. First, look at the sizes:

m2 is the model saved by save() after training: it keeps both the graph structure of the model and its parameters, so it is the biggest.

m1 is the model saved by save() before training. It keeps the graph structure, but since the model has not been compiled or trained it contains no optimizer state, so it is much smaller than m2.

m3 is what save_weights() produces: only the parameters of the model, with no graph structure, so it is also much smaller than m2.

 

Through a visualization tool, we found that opening m1 and m2 shows the following structure:

When opening m3, the visualization tool reports an error, which confirms that save_weights() does not store model structure information.


Loading the model

Model files saved by the two different methods also require different loading methods.

from keras.models import load_model

model = load_model('m1.h5')
#model = load_model('m2.h5')
#model = load_model('m3.h5')
model.summary()

Only loading m3.h5 makes this code report an error; the other outputs are as follows:

As you can see, only the .h5 files saved by save() can be opened directly with load_model()!

So how can we open the saved weights file (m3.h5)?

This is a little more complicated. Because m3 does not contain model structure information, we need to describe the model structure again before loading m3, as follows:

from keras.models import Model
from keras.layers import Input, Dense


inputs = Input(shape=(784, ))
x = Dense(64, activation='relu')(inputs)
x = Dense(64, activation='relu')(x)
y = Dense(10, activation='softmax')(x)

model = Model(inputs=inputs, outputs=y)
model.load_weights('m3.h5')

Replacing m3 above with m1 or m2 also works. The model saved by save() clearly has the advantage, apart from taking up more disk space; so as long as disk space is not scarce, it is recommended to use save().

Be careful! If you want to use load_weights(), you must ensure that the parameterized computation structure you describe is exactly consistent with the one in the .h5 file. What is a parameterized computation structure? Anything that holds trainable weights. We changed the non-parametric parts of the structure above and found that the .h5 file still loads successfully; for example, changing softmax to relu does not affect loading.

 

That covers Keras's save() and save_weights().

 

Thoroughly Solve CHECK_NRPE: Error – Could Not Complete SSL Handshake

The error “CHECK_NRPE: Error - Could not complete SSL handshake.” appears.
Steps to solve the problem:
1. Ensure that the openssh, openssl, and openssl-devel versions are consistent.
On CentOS, you can run:

yum install openssl openssl-devel

2. Ensure the allowed addresses are configured correctly on the NRPE target host so the Nagios monitoring server can connect. For example, the configuration on the target host

(command: vim /usr/local/nagios/etc/nrpe.cfg):

allowed_hosts=127.0.0.1,192.168.177.174
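To confirm the setting is in place, you can grep the config for the Nagios server's IP. The sketch below uses a temporary file as a stand-in for the real /usr/local/nagios/etc/nrpe.cfg:

```shell
# Sketch: check that allowed_hosts contains the Nagios server's IP.
# A temp file stands in for /usr/local/nagios/etc/nrpe.cfg here.
cfg=$(mktemp)
printf 'allowed_hosts=127.0.0.1,192.168.177.174\n' > "$cfg"
grep -q '192\.168\.177\.174' "$cfg" && echo "nagios server allowed"
rm -f "$cfg"
```

Remember to restart the NRPE daemon on the target host after changing allowed_hosts so the new value takes effect.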

SQL Server Version Problem


Recently I installed SQL Server 2005 on my computer, and everything works normally there.

When connecting to a server to operate on its database, however, an "Unspecified error" appears when creating a table:

The details of the error are as follows:

===================================


Unspecified error
 (MS Visual Database Tools)


------------------------------
Program Location:


   at Microsoft.VisualStudio.DataTools.Interop.IDTTableDesignerFactory.NewTable(Object dsRef, Object pServiceProvider)
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.TableDesignerNode.CreateDesigner(IDTDocToolFactoryProvider factoryProvider, IVsDataConnection dataConnection)
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.VsDataDesignerNode.CreateDesigner()
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.VsDataDesignerNode.Open()
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.VirtualProject.Microsoft.SqlServer.Management.UI.VSIntegration.Editors.ISqlVirtualProject.CreateDesigner(Urn origUrn, DocumentType editorType, DocumentOptions aeOptions, IManagedConnection con)
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.ISqlVirtualProject.CreateDesigner(Urn origUrn, DocumentType editorType, DocumentOptions aeOptions, IManagedConnection con)
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.ScriptFactory.CreateDesigner(DocumentType editorType, DocumentOptions aeOptions, Urn parentUrn, IManagedConnection mc)
   at Microsoft.SqlServer.Management.UI.VSIntegration.Editors.VsDocumentMenuItem.CreateDesignerWindow(IManagedConnection mc, DocumentOptions options)

=========================================

Searching the Internet revealed that the problem was between SQL Server versions: SQL Server 2005 Management Studio cannot operate on SQL Server 2008, which causes this error.

The solution is to replace the 2005 tools with the 2008 version.

E667: Fsync failed

When editing the file /proc/sys/kernel/core_pattern with vim, saving fails with E667: Fsync failed. Write the value directly instead:

echo "core-%e-%p-%t" | sudo dd of=/proc/sys/kernel/core_pattern
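Vim fails here because it tries to fsync() a file under /proc, which that filesystem does not support; any direct write avoids the editor entirely. Besides dd, piping through tee works the same way. The sketch below uses a temporary file in place of the real /proc path:

```shell
# Write the value through tee instead of an editor; on the real system the
# target would be /proc/sys/kernel/core_pattern and tee would run via sudo.
target=$(mktemp)
echo "core-%e-%p-%t" | tee "$target" > /dev/null
cat "$target"   # prints core-%e-%p-%t
rm -f "$target"
```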

reference: https://askubuntu.com/questions/167819/im-getting-fsync-failed-error-why

MacPorts Prompts “Warning: xcodebuild exists but failed to execute” When Installing Software


# port search imagemagick

I searched the Internet and tried the following method, which succeeded.

The steps which solved it for me were:

1. Install Xcode 4.3 from the Mac App Store

2. Install the Command Line Tools for Xcode

3. sudo /usr/bin/xcode-select -switch /Applications/Xcode.app

4. sudo ln -s `which clang` /Applications/Xcode.app/Contents/Developer/usr/bin/

Even so, this is clearly a workaround.

My Xcode came as a downloaded DMG that gets mounted each time it starts, so its directory is /Volumes/Xcode.app/…

For users who installed from the App Store, the target should be /Applications/Xcode.app/…

In my case, only step 3 was needed to succeed.

How to Solve curl: (7) Failed to connect to raw.githubusercontent.com port 443: Connection refused

Background

The following error was reported when installing oh-my-zsh:

sh -c "$(curl -fsSL https://raw.github.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"
curl: (7) Failed to connect to raw.githubusercontent.com port 443: Connection refused

I recently noticed that GitHub users' profile pictures and the images in my posts no longer show up. Then today I found the error in the title when installing Homebrew and nvm.

The same error appeared when installing pnpm; you can see that the install script needs to pull code from raw.githubusercontent.com.

Searching the Internet shows that DNS resolution for some GitHub domain names is polluted, so the resolution process cannot obtain the correct IP address for the domain name.

DNS pollution

Interested readers can look up how DNS pollution works.

Solution

Open https://www.ipaddress.com/ and enter the inaccessible domain name.

The query returns the correct IP address.

Add it to the machine's hosts file. SwitchHosts is recommended to make hosts management easier.

199.232.68.133 raw.githubusercontent.com
199.232.68.133 user-images.githubusercontent.com
199.232.68.133 avatars2.githubusercontent.com
199.232.68.133 avatars1.githubusercontent.com

After adding the host entries above, page images display normally, Homebrew can be installed, and nvm works again.
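The hosts-file step can be sketched as a couple of shell commands. A temporary file stands in for /etc/hosts below (editing the real file needs sudo), and the IP should come from your own ipaddress.com lookup:

```shell
# Append the mapping and verify the entry is present; substitute /etc/hosts
# (and sudo) on a real machine.
hosts=$(mktemp)
echo "199.232.68.133 raw.githubusercontent.com" >> "$hosts"
grep raw.githubusercontent.com "$hosts"   # prints the line just added
rm -f "$hosts"
```

On macOS you may also need to flush the DNS cache afterwards (sudo dscacheutil -flushcache) before the new mapping is used.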

Reference

https://github.com/hawtim/blog/issues/10