GuzzleHttp\Exception\RequestException: cURL error 18: transfer closed with outstanding read data remaining (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
A direct request from Postman returned the data (though without pretty-printing it), while a request from curl returned the data followed by an error message.
Solution:
(This fix applies only to the specific problem encountered here; the same error can be caused by different things.)
Both Nginx and PHP-FPM on the server should run as the www user. Set Nginx's user to www and unify the ownership/permission configuration of the LNMP stack:
chown -R www:www /var/lib/nginx/
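Concretely, the user settings involved look roughly like this. The file paths are typical defaults and may differ on your distribution; verify them before editing:

```
# /etc/nginx/nginx.conf
user www;

# /etc/php-fpm.d/www.conf (PHP-FPM pool)
user = www
group = www
```

With both services running as www and /var/lib/nginx/ owned by www, the FastCGI buffer files become writable by the Nginx workers.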
Troubleshooting process:
The error message points to the libcurl error list at http://curl.haxx.se/libcurl/c/libcurl-errors.html:
CURLE_PARTIAL_FILE (18)
A file transfer was shorter or larger than expected. This happens when the server first reports an expected transfer size, and then delivers data that doesn't match the previously given size.
In other words, the amount of data actually transferred did not match the size the server announced up front.
At first, we used Postman to call the failing endpoint in our project and got data back, so we assumed the Guzzle request itself was at fault and was sending bad request headers. We spent a lot of time testing HTTP/1.1 header attributes such as Connection: keep-alive, Content-Length, and Transfer-Encoding: chunked, but nothing helped.
Carefully requesting the interface with the curl command also reproduced the error message, so I shifted focus to the server-side configuration. My first guess was that Nginx or FPM capped the maximum amount of data transferred, but the response was under 100 KB, which should be well within any configured limit.
I then searched Baidu for "PHP Nginx HTTP interface returns the maximum content" and found:
https://blog.csdn.net/sakurallj/article/details/51822828
http://www.dewen.net.cn/q/1913
If the PHP response is too large, Nginx first buffers part of it to temporary files under /var/lib/nginx/tmp/fastcgi and only sends it to the client after receiving the full content. However, the user Nginx runs as did not have write permission on that temporary directory.
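A quick way to confirm this failure mode is to check whether the FastCGI buffer directory is writable by the worker's user. This is a sketch using a scratch directory; on a real server you would point it at /var/lib/nginx/tmp/fastcgi instead:

```shell
# Sketch: check writability of a fastcgi buffer directory.
# Uses a throwaway scratch dir; substitute /var/lib/nginx/tmp/fastcgi
# (run as the nginx worker user) on a real server.
scratch=$(mktemp -d)
mkdir -p "$scratch/tmp/fastcgi"
if touch "$scratch/tmp/fastcgi/0000000433" 2>/dev/null; then
  result="writable"
else
  result="not writable"   # this is the state that produced cURL error 18 here
fi
echo "buffer dir is $result"
rm -rf "$scratch"
```

If the directory is not writable, Nginx drops the buffered tail of the response, so the client receives less data than the announced size, which is exactly what CURLE_PARTIAL_FILE (18) reports.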
I then checked Nginx's error_log and indeed found the corresponding error entry. After unifying the user that the LNMP stack runs as, the problem was solved:
2018/12/13 18:41:31 [crit] 30392#0: *380874 open() "/var/lib/nginx/tmp/fastcgi/3/43/0000000433" failed (13: Permission denied) while reading upstream, client: 182.18.28.66, server: , request: "GET /v1/user/list?page=1&perpage=50 HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "123.123.123.13:11008"
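Filtering the error log for permission failures surfaces these entries quickly. /var/log/nginx/error.log is an assumed default path; for illustration this sketch greps a sample line rather than a live log:

```shell
# Count "Permission denied" entries; here we grep a sample log line.
# On a real server: grep 'Permission denied' /var/log/nginx/error.log
sample='2018/12/13 18:41:31 [crit] 30392#0: open() "/var/lib/nginx/tmp/fastcgi/3/43/0000000433" failed (13: Permission denied) while reading upstream'
matches=$(echo "$sample" | grep -c 'Permission denied')
echo "$matches"
```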
Takeaway: my understanding of Linux's user and permission model is not deep enough, and my grasp of the HTTP protocol is only partial. When a weird problem shows up, the first step should always be to check the logs!