                                 cURL Project

NAME
       curl - transfer a URL

SYNOPSIS
       curl [options] [URL...]

DESCRIPTION
       curl is a tool to transfer data from or to a server, using one of the supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP). The command is designed to work without user interaction.

       curl offers a busload of useful tricks like proxy support, user authentication, FTP upload, HTTP post, SSL connections, cookies, file transfer resume, Metalink, and more. As you will see below, the number of features will make your head spin!

       curl is powered by libcurl for all transfer-related features. See libcurl(3) for details.

URL
       The URL syntax is protocol-dependent. You'll find a detailed description in RFC 3986.

       You can specify multiple URLs or parts of URLs by writing part sets within braces as in:

         http://site.{one,two,three}.com

       or you can get sequences of alphanumeric series by using [] as in:

         ftp://ftp.numericals.com/file[1-100].txt
         ftp://ftp.numericals.com/file[001-100].txt    (with leading zeros)
         ftp://ftp.letters.com/file[a-z].txt

       Nested sequences are not supported, but you can use several ones next to each other:

         http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html

       You can specify any number of URLs on the command line. They will be fetched in a sequential manner in the specified order.

       You can specify a step counter for the ranges to get every Nth number or letter:

         http://www.numericals.com/file[1-100:10].txt
         http://www.letters.com/file[a-z:2].txt

       If you specify a URL without a protocol:// prefix, curl will attempt to guess what protocol you might want. It will then default to HTTP but try other protocols based on often-used host name prefixes. For example, for host names starting with "ftp." curl will assume you want to speak FTP.

       curl will do its best to use what you pass to it as a URL. It is not trying to validate it as a syntactically correct URL by any means but is instead very liberal with what it accepts.

       curl will attempt to re-use connections for multiple file transfers, so that getting many files from the same server will not do multiple connects / handshakes. This improves speed. Of course this is only done on files specified on a single command line and cannot be used between separate curl invocations.

PROGRESS METER
       curl normally displays a progress meter during operations, indicating the amount of transferred data, transfer speeds and estimated time left, etc.

       curl displays this data to the terminal by default, so if you invoke curl to do an operation and it is about to write data to the terminal, it disables the progress meter as otherwise it would mess up the output, mixing progress meter and response data.

       If you want a progress meter for HTTP POST or PUT requests, you need to redirect the response output to a file, using shell redirect (>), -o [file] or similar.

       This is not the case for FTP upload, as that operation does not spit out any response data to the terminal.

       If you prefer a progress "bar" instead of the regular meter, -# is your friend.
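
       As an illustrative sketch (the host name and form field below are placeholders, not taken from this manual), a POST request that keeps the progress bar visible by sending the response to a file:

         curl -# -d "name=value" -o response.html http://example.com/submit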
OPTIONS
       Options start with one or two dashes. Many of the options require an additional value next to them.

       The short "single-dash" form of the options, -d for example, may be used with or without a space between it and its value, although a space is a recommended separator. The long "double-dash" form, --data for example, requires a space between it and its value.

       Short version options that don't need any additional values can be used immediately next to each other, like for example you can specify all the options -O, -L and -v at once as -OLv.

       In general, all boolean options are enabled with --option and yet again disabled with --no-option. That is, you use the exact same option name but prefix it with "no-". However, in this list we mostly only list and show the --option version of them. (This concept with --no options was added in 7.19.0. Previously most options were toggled on/off on repeated use of the same command line option.)

       -#, --progress-bar
              Make curl display progress as a simple progress bar instead of the standard, more informational, meter.

       -0, --http1.0
              (HTTP) Tells curl to use HTTP version 1.0 instead of using its internally preferred version, HTTP 1.1.

       --http1.1
              (HTTP) Tells curl to use HTTP version 1.1. This is the internal default version. (Added in 7.33.0)

       --http2.0
              (HTTP) Tells curl to issue its requests using HTTP 2.0. This requires that the underlying libcurl was built to support it. (Added in 7.33.0)

       -1, --tlsv1
              (SSL) Forces curl to use TLS version 1 when negotiating with a remote TLS server.

       -2, --sslv2
              (SSL) Forces curl to use SSL version 2 when negotiating with a remote SSL server.

       -3, --sslv3
              (SSL) Forces curl to use SSL version 3 when negotiating with a remote SSL server.

       -4, --ipv4
              If curl is capable of resolving an address to multiple IP versions (which it is if it is IPv6-capable), this option tells curl to resolve names to IPv4 addresses only.

       -6, --ipv6
              If curl is capable of resolving an address to multiple IP versions (which it is if it is IPv6-capable), this option tells curl to resolve names to IPv6 addresses only.

       -a, --append
              (FTP/SFTP) When used in an upload, this will tell curl to append to the target file instead of overwriting it. If the file doesn't exist, it will be created. Note that this flag is ignored by some SSH servers (including OpenSSH).

       -A, --user-agent <agent string>
              (HTTP) Specify the User-Agent string to send to the HTTP server. Some badly done CGIs fail if this field isn't set to "Mozilla/4.0". To encode blanks in the string, surround the string with single quote marks. This can also be set with the -H, --header option of course.

              If this option is used several times, the last one will be used.

       --anyauth
              (HTTP) Tells curl to figure out the authentication method by itself, and use the most secure one the remote site claims to support. This is done by first doing a request and checking the response-headers, thus possibly inducing an extra network round-trip. This is used instead of setting a specific authentication method, which you can do with --basic, --digest, --ntlm, and --negotiate.

              Note that using --anyauth is not recommended if you do uploads from stdin, since it may require data to be sent twice and then the client must be able to rewind. If the need should arise when uploading from stdin, the upload operation will fail.
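
              As a hedged sketch of how --anyauth is typically combined with -u, --user (host and credentials are placeholders):

                curl --anyauth -u name:password http://example.com/protected/page.html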

       -b, --cookie <name=data>
              (HTTP) Pass the data to the HTTP server as a cookie. It is supposedly the data previously received from the server in a "Set-Cookie:" line. The data should be in the format "NAME1=VALUE1; NAME2=VALUE2".

              If no '=' symbol is used in the line, it is treated as a filename to use to read previously stored cookie lines from, which should be used in this session if they match. Using this method also activates the "cookie parser" which will make curl record incoming cookies too, which may be handy if you're using this in combination with the -L, --location option. The file format of the file to read cookies from should be plain HTTP headers or the Netscape/Mozilla cookie file format.

              NOTE that the file specified with -b, --cookie is only used as input. No cookies will be stored in the file. To store cookies, use the -c, --cookie-jar option or you could even save the HTTP headers to a file using -D, --dump-header!

              If this option is used several times, the last one will be used.

       -B, --use-ascii
              (FTP/LDAP) Enable ASCII transfer. For FTP, this can also be enforced by using a URL that ends with ";type=A". This option causes data sent to stdout to be in text mode for win32 systems.

       --basic
              (HTTP) Tells curl to use HTTP Basic authentication. This is the default and this option is usually pointless, unless you use it to override a previously set option that sets a different authentication method (such as --ntlm, --digest, or --negotiate).

       -c, --cookie-jar <file name>
              (HTTP) Specify to which file you want curl to write all cookies after a completed operation. Curl writes all cookies previously read from a specified file as well as all cookies received from remote server(s). If no cookies are known, no file will be written. The file will be written using the Netscape cookie file format. If you set the file name to a single dash, "-", the cookies will be written to stdout.

              This command line option will activate the cookie engine that makes curl record and use cookies. Another way to activate it is to use the -b, --cookie option.

              If the cookie jar can't be created or written to, the whole curl operation won't fail or even report an error clearly. Using -v will get a warning displayed, but that is the only visible feedback you get about this possibly lethal situation.

              If this option is used several times, the last specified file name will be used.

       -C, --continue-at <offset>
              Continue/Resume a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped, counting from the beginning of the source file before it is transferred to the destination. If used with uploads, the FTP server command SIZE will not be used by curl.

              Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.

              If this option is used several times, the last one will be used.
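
              For instance, a hedged sketch of resuming an interrupted download into the same local file (URL and file name are placeholders):

                curl -C - -o bigfile.iso http://example.com/bigfile.iso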

       --ciphers <list of ciphers>
              (SSL) Specifies which ciphers to use in the connection. The list of ciphers must specify valid ciphers. Read up on SSL cipher list details on this URL: http://www.openssl.org/docs/apps/ciphers.html

              NSS ciphers are done differently than OpenSSL and GnuTLS. The full list of NSS ciphers is in the NSSCipherSuite entry at this URL: http://git.fedorahosted.org/cgit/mod_nss.git/plain/docs/mod_nss.html#Directives

              If this option is used several times, the last one will be used.

       --compressed
              (HTTP) Request a compressed response using one of the algorithms curl supports, and save the uncompressed document. If this option is used and the server sends an unsupported encoding, curl will report an error.

       --connect-timeout <seconds>
              Maximum time in seconds that you allow the connection to the server to take. This only limits the connection phase; once curl has connected this option is of no more use. Since 7.32.0, this option accepts decimal values, but the actual timeout will decrease in accuracy as the specified timeout increases in decimal precision. See also the -m, --max-time option.

              If this option is used several times, the last one will be used.

       --create-dirs
              When used in conjunction with the -o option, curl will create the necessary local directory hierarchy as needed. This option creates the dirs mentioned with the -o option, nothing else. If the -o file name uses no dir or if the dirs it mentions already exist, no dir will be created.

              To create remote directories when using FTP or SFTP, try --ftp-create-dirs.

       --crlf (FTP) Convert LF to CRLF in upload. Useful for MVS (OS/390).

       --crlfile <file>
              (HTTPS/FTPS) Provide a file using PEM format with a Certificate Revocation List that may specify peer certificates that are to be considered revoked. (Added in 7.19.7)

              If this option is used several times, the last one will be used.

       -d, --data <data>
              (HTTP) Sends the specified data in a POST request to the HTTP server, in the same way that a browser does when a user has filled in an HTML form and presses the submit button. This will cause curl to pass the data to the server using the content-type application/x-www-form-urlencoded. Compare to -F, --form.

              -d, --data is the same as --data-ascii. To post purely binary data, you should instead use the --data-binary option. To URL-encode the value of a form field you may use --data-urlencode.

              If any of these options is used more than once on the same command line, the data pieces specified will be merged together with a separating &-symbol. Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that looks like 'name=daniel&skill=lousy'.

              If you start the data with the letter @, the rest should be a file name to read the data from, or - if you want curl to read the data from stdin. Multiple files can also be specified. Posting data from a file named 'foobar' would thus be done with --data @foobar. When --data is told to read from a file like that, carriage returns and newlines will be stripped out.
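
              For example, the post chunk described above could be sent with (the host is a placeholder):

                curl -d name=daniel -d skill=lousy http://example.com/form.cgi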

       -D, --dump-header <file>
              Write the protocol headers to the specified file.

              This option is handy to use when you want to store the headers that an HTTP site sends to you. Cookies from the headers could then be read in a second curl invocation by using the -b, --cookie option! The -c, --cookie-jar option is however a better way to store cookies.

              When used in FTP, the FTP server response lines are considered being "headers" and thus are saved there.

              If this option is used several times, the last one will be used.

       --data-ascii <data>
              See -d, --data.

       --data-binary <data>
              (HTTP) This posts data exactly as specified with no extra processing whatsoever.

              If you start the data with the letter @, the rest should be a filename. Data is posted in a similar manner as --data-ascii does, except that newlines and carriage returns are preserved and conversions are never done.

              If this option is used several times, the ones following the first will append data as described in -d, --data.

       --data-urlencode <data>
              (HTTP) This posts data, similar to the other --data options with the exception that this performs URL-encoding. (Added in 7.18.0)

              To be CGI-compliant, the <data> part should begin with a name followed by a separator and a content specification. The <data> part can be passed to curl using one of the following syntaxes:

              content
                     This will make curl URL-encode the content and pass that on. Just be careful so that the content doesn't contain any = or @ symbols, as that will then make the syntax match one of the other cases below!

              =content
                     This will make curl URL-encode the content and pass that on. The preceding = symbol is not included in the data.

              name=content
                     This will make curl URL-encode the content part and pass that on. Note that the name part is expected to be URL-encoded already.

              @filename
                     This will make curl load data from the given file (including any newlines), URL-encode that data and pass it on in the POST.

              name@filename
                     This will make curl load data from the given file (including any newlines), URL-encode that data and pass it on in the POST. The name part gets an equal sign appended, resulting in name=urlencoded-file-content. Note that the name is expected to be URL-encoded already.

       --delegation LEVEL
              Set LEVEL to tell the server what it is allowed to delegate when it comes to user credentials. Used with GSS/kerberos.

              none   Don't allow any delegation.

              policy Delegates if and only if the OK-AS-DELEGATE flag is set in the Kerberos service ticket, which is a matter of realm policy.

              always Unconditionally allow the server to delegate.

       --digest
              (HTTP) Enables HTTP Digest authentication. This is an authentication scheme that prevents the password from being sent over the wire in clear text. Use this in combination with the normal -u, --user option to set user name and password. See also --ntlm, --negotiate and --anyauth for related options.

              If this option is used several times, only the first one is used.
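
              As a hedged illustration of --digest together with -u, --user (host and credentials are placeholders):

                curl --digest -u name:password http://example.com/protected/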

       --disable-eprt
              (FTP) Tell curl to disable the use of the EPRT and LPRT commands when doing active FTP transfers. Curl will normally always first attempt to use EPRT, then LPRT before using PORT, but with this option, it will use PORT right away. EPRT and LPRT are extensions to the original FTP protocol, and may not work on all servers, but they enable more functionality in a better way than the traditional PORT command.

              --eprt can be used to explicitly enable EPRT again and --no-eprt is an alias for --disable-eprt.

              Disabling EPRT only changes the active behavior. If you want to switch to passive mode you need to not use -P, --ftp-port or force it with --ftp-pasv.

       --disable-epsv
              (FTP) Tell curl to disable the use of the EPSV command when doing passive FTP transfers. Curl will normally always first attempt to use EPSV before PASV, but with this option, it will not try using EPSV.

              --epsv can be used to explicitly enable EPSV again and --no-epsv is an alias for --disable-epsv.

              Disabling EPSV only changes the passive behavior. If you want to switch to active mode you need to use -P, --ftp-port.

       --dns-interface <interface>
              Tell curl to send outgoing DNS requests through <interface>. This option is a counterpart to --interface (which does not affect DNS). The supplied string must be an interface name (not an address).

              This option requires that libcurl was built with a resolver backend that supports this operation. The c-ares backend is the only such one. (Added in 7.33.0)

       --dns-ipv4-addr <ip-address>
              Tell curl to bind to <ip-address> when making IPv4 DNS requests, so that the DNS requests originate from this address. The argument should be a single IPv4 address.

              This option requires that libcurl was built with a resolver backend that supports this operation. The c-ares backend is the only such one. (Added in 7.33.0)

       --dns-ipv6-addr <ip-address>
              Tell curl to bind to <ip-address> when making IPv6 DNS requests, so that the DNS requests originate from this address. The argument should be a single IPv6 address.

              This option requires that libcurl was built with a resolver backend that supports this operation. The c-ares backend is the only such one. (Added in 7.33.0)

       --dns-servers <ip-address,ip-address>
              Set the list of DNS servers to be used instead of the system default. The list of IP addresses should be separated with commas. Port numbers may also optionally be given as :<port-number> after each IP address.

              This option requires that libcurl was built with a resolver backend that supports this operation. The c-ares backend is the only such one. (Added in 7.33.0)

       -e, --referer <URL>
              (HTTP) Sends the "Referer Page" information to the HTTP server. This can also be set with the -H, --header flag of course. When used with -L, --location you can append ";auto" to the --referer URL to make curl automatically set the previous URL when it follows a Location: header. The ";auto" string can be used alone, even if you don't set an initial --referer.

              If this option is used several times, the last one will be used.
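
              As an illustrative sketch (placeholder URL): follow redirects and let curl set the Referer automatically at each hop, without an initial value:

                curl -L -e ";auto" http://example.com/start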

       -E, --cert <certificate[:password]>
              (SSL) Tells curl to use the specified client certificate file when getting a file with HTTPS, FTPS or another SSL-based protocol. The certificate must be in PKCS#12 format if using Secure Transport, or PEM format if using any other engine. If the optional password isn't specified, it will be queried for on the terminal. Note that this option assumes a "certificate" file that is the private key and the private certificate concatenated! See --cert and --key to specify them independently.

              If curl is built against the NSS SSL library then this option can tell curl the nickname of the certificate to use within the NSS database defined by the environment variable SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded. If you want to use a file from the current directory, please precede it with "./" prefix, in order to avoid confusion with a nickname. If the nickname contains ":", it needs to be preceded by "\" so that it is not recognized as a password delimiter. If the nickname contains "\", it needs to be escaped as "\\" so that it is not recognized as an escape character.

              (iOS and Mac OS X only) If curl is built against Secure Transport, then the certificate string can either be the name of a certificate/private key in the system or user keychain, or the path to a PKCS#12-encoded certificate and private key. If you want to use a file from the current directory, please precede it with "./" prefix, in order to avoid confusion with a nickname.

              If this option is used several times, the last one will be used.

       --engine <name>
              Select the OpenSSL crypto engine to use for cipher operations. Use --engine list to print a list of build-time supported engines. Note that not all (or even any) of the engines may be available at run-time.

       --environment
              (RISC OS ONLY) Sets a range of environment variables, using the names the -w option supports, to allow easier extraction of useful information after having run curl.

       --egd-file <file>
              (SSL) Specify the path name to the Entropy Gathering Daemon socket. The socket is used to seed the random engine for SSL connections. See also the --random-file option.

       --cert-type <type>
              (SSL) Tells curl what certificate type the provided certificate is in. PEM, DER and ENG are recognized types. If not specified, PEM is assumed.

              If this option is used several times, the last one will be used.

       --cacert <CA certificate>
              (SSL) Tells curl to use the specified certificate file to verify the peer. The file may contain multiple CA certificates. The certificate(s) must be in PEM format. Normally curl is built to use a default file for this, so this option is typically used to alter that default file.

              curl recognizes the environment variable named 'CURL_CA_BUNDLE' if it is set, and uses the given path as a path to a CA cert bundle. This option overrides that variable.

              The windows version of curl will automatically look for a CA certs file named 'curl-ca-bundle.crt', either in the same directory as curl.exe, or in the Current Working Directory, or in any folder along your PATH.

              If curl is built against the NSS SSL library, the NSS PEM PKCS#11 module (libnsspem.so) needs to be available for this option to work properly.

              If this option is used several times, the last one will be used.
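
              For example, a hedged sketch of verifying a server against a private CA bundle (file name and host are placeholders):

                curl --cacert ./my-ca-bundle.pem https://example.com/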

       --capath <CA certificate directory>
              (SSL) Tells curl to use the specified certificate directory to verify the peer. Multiple paths can be provided by separating them with ":" (e.g. "path1:path2:path3"). The certificates must be in PEM format, and if curl is built against OpenSSL, the directory must have been processed using the c_rehash utility supplied with OpenSSL. Using --capath can allow OpenSSL-powered curl to make SSL-connections much more efficiently than using --cacert if the --cacert file contains many CA certificates.

              If this option is set, the default capath value will be ignored, and if it is used several times, the last one will be used.

       -f, --fail
              (HTTP) Fail silently (no output at all) on server errors. This is mostly done to better enable scripts etc. to deal with failed attempts. In normal cases when an HTTP server fails to deliver a document, it returns an HTML document stating so (which often also describes why and more). This flag will prevent curl from outputting that and return error 22.

              This method is not fail-safe and there are occasions where non-successful response codes will slip through, especially when authentication is involved (response codes 401 and 407).

       -F, --form <name=content>
              (HTTP) This lets curl emulate a filled-in form in which a user has pressed the submit button. This causes curl to POST data using the Content-Type multipart/form-data according to RFC 2388. This enables uploading of binary files etc. To force the 'content' part to be a file, prefix the file name with an @ sign. To just get the content part from a file, prefix the file name with the symbol <. The difference between @ and < is then that @ makes a file get attached in the post as a file upload, while the < makes a text field and just gets the contents for that text field from a file.

              Example, to send your password file to the server, where 'password' is the name of the form-field to which /etc/passwd will be the input:

                curl -F password=@/etc/passwd www.mypasswords.com

              To read content from stdin instead of a file, use - as the filename. This goes for both @ and < constructs.

              You can also tell curl what Content-Type to use by using 'type=', in a manner similar to:

                curl -F "web=@index.html;type=text/html" url.com

              or

                curl -F "name=daniel;type=text/foo" url.com

              You can also explicitly change the name field of a file upload part by setting filename=, like this:

                curl -F "file=@localfile;filename=nameinpost" url.com

              If filename/path contains ',' or ';', it must be quoted by double-quotes like:

                curl -F "file=@\"localfile\";filename=\"nameinpost\"" url.com

              or

                curl -F 'file=@"localfile";filename="nameinpost"' url.com

              Note that if a filename/path is quoted by double-quotes, any double-quote or backslash within the filename must be escaped by backslash.

              See further examples and details in the MANUAL.

              This option can be used multiple times.

       --ftp-account [data]
              (FTP) When an FTP server asks for "account data" after user name and password has been provided, this data is sent off using the ACCT command. (Added in 7.13.0)

              If this option is used several times, the last one will be used.

       --ftp-alternative-to-user <command>
              (FTP) If authenticating with the USER and PASS commands fails, send this command. When connecting to Tumbleweed's Secure Transport server over FTPS using a client certificate, using "SITE AUTH" will tell the server to retrieve the username from the certificate. (Added in 7.15.5)
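
              A hedged sketch of that last case (the host is a placeholder and the server setup is assumed): present a client certificate and fall back to "SITE AUTH" if USER/PASS authentication fails:

                curl -E client.pem --ftp-alternative-to-user "SITE AUTH" ftps://ftp.example.com/file.txt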

       --ftp-create-dirs
              (FTP/SFTP) When an FTP or SFTP URL/operation uses a path that doesn't currently exist on the server, the standard behavior of curl is to fail. Using this option, curl will instead attempt to create missing directories.

       --ftp-method [method]
              (FTP) Control what method curl should use to reach a file on an FTP(S) server. The method argument should be one of the following alternatives:

              multicwd
                     curl does a single CWD operation for each path part in the given URL. For deep hierarchies this means very many commands. This is how RFC 1738 says it should be done. This is the default but the slowest behavior.

              nocwd  curl does no CWD at all. curl will do SIZE, RETR, STOR etc and give a full path to the server for all these commands. This is the fastest behavior.

              singlecwd
                     curl does one CWD with the full target directory and then operates on the file "normally" (like in the multicwd case). This is somewhat more standards compliant than 'nocwd' but without the full penalty of 'multicwd'. (Added in 7.15.1)

       --ftp-pasv
              (FTP) Use passive mode for the data connection. Passive is the internal default behavior, but this option can be used to override a previous -P/--ftp-port option. (Added in 7.11.0)

              If this option is used several times, only the first one is used. Undoing an enforced passive really isn't doable but you must then instead enforce the correct -P, --ftp-port again.

              Passive mode means that curl will try the EPSV command first and then PASV, unless --disable-epsv is used.

       --ftp-skip-pasv-ip
              (FTP) Tell curl to not use the IP address the server suggests in its response to curl's PASV command when curl connects the data connection. Instead curl will re-use the same IP address it already uses for the control connection. (Added in 7.14.2)

              This option has no effect if PORT, EPRT or EPSV is used instead of PASV.

       --ftp-pret
              (FTP) Tell curl to send a PRET command before PASV (and EPSV). Certain FTP servers, mainly drftpd, require this non-standard command for directory listings as well as up and downloads in PASV mode. (Added in 7.20.x)

       --ftp-ssl-ccc
              (FTP) Use CCC (Clear Command Channel). Shuts down the SSL/TLS layer after authenticating. The rest of the control channel communication will be unencrypted. This allows NAT routers to follow the FTP transaction. The default mode is passive. See --ftp-ssl-ccc-mode for other modes. (Added in 7.16.1)

       --ftp-ssl-ccc-mode [active/passive]
              (FTP) Use CCC (Clear Command Channel). Sets the CCC mode. The passive mode will not initiate the shutdown, but instead wait for the server to do it, and will not reply to the shutdown from the server. The active mode initiates the shutdown and waits for a reply from the server. (Added in 7.16.2)

       --ftp-ssl-control
              (FTP) Require SSL/TLS for the FTP login, clear for transfer. Allows secure authentication, but non-encrypted data transfers for efficiency. Fails the transfer if the server doesn't support SSL/TLS. (Added in 7.16.0)
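
              For example, a hedged sketch (placeholder host and credentials) that encrypts only the FTP login:

                curl --ftp-ssl-control -u name:password ftp://ftp.example.com/file.txt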

       --form-string <name=string>
              (HTTP) Similar to --form except that the value string for the named parameter is used literally. Leading '@' and '<' characters, and the ';type=' string in the value have no special meaning. Use this in preference to --form if there's any possibility that the string value may accidentally trigger the '@' or '<' features of --form.

       -g, --globoff
              This option switches off the "URL globbing parser". When you set this option, you can specify URLs that contain the letters {}[] without having them interpreted by curl itself. Note that these letters are not normal legal URL contents but they should be encoded according to the URI standard.

       -G, --get
              When used, this option will make all data specified with -d, --data, --data-binary or --data-urlencode be used in an HTTP GET request instead of the POST request that otherwise would be used. The data will be appended to the URL with a '?' separator. If used in combination with -I, the POST data will instead be appended to the URL with a HEAD request.

              If this option is used several times, only the first one is used. This is because undoing a GET doesn't make sense, but you should then instead enforce the alternative method you prefer.

       -H, --header <header>
              (HTTP) Extra header to use when getting a web page. You may specify any number of extra headers. Note that if you should add a custom header that has the same name as one of the internal ones curl would use, your externally set header will be used instead of the internal one. This allows you to make even trickier stuff than curl would normally do. You should not replace internally set headers without knowing perfectly well what you're doing. Remove an internal header by giving a replacement without content on the right side of the colon, as in: -H "Host:". If you send the custom header with no value then its header must be terminated with a semicolon, such as -H "X-Custom-Header;" to send "X-Custom-Header:".

              curl will make sure that each header you add/replace is sent with the proper end-of-line marker, you should thus not add that as a part of the header content: do not add newlines or carriage returns, they will only mess things up for you.

              See also the -A, --user-agent and -e, --referer options.

              This option can be used multiple times to add/replace/remove multiple headers.

       --hostpubmd5 <md5>
              (SCP/SFTP) Pass a string containing 32 hexadecimal digits. The string should be the 128 bit MD5 checksum of the remote host's public key, curl will refuse the connection with the host unless the md5sums match. (Added in 7.17.1)

       --ignore-content-length
              (HTTP) Ignore the Content-Length header. This is particularly useful for servers running Apache 1.x, which will report incorrect Content-Length for files larger than 2 gigabytes.

       -i, --include
              (HTTP) Include the HTTP-header in the output. The HTTP-header includes things like server-name, date of the document, HTTP-version and more...

       -I, --head
              (HTTP/FTP/FILE) Fetch the HTTP-header only! HTTP-servers feature the command HEAD which this uses to get nothing but the header of a document. When used on an FTP or FILE file, curl displays the file size and last modification time only.
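
              For instance (placeholder URL), to look at only the response headers of a document:

                curl -I http://example.com/index.html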

       --interface <name>
              Perform an operation using a specified interface. You can enter interface name, IP address or host name. An example could look like:

                curl --interface eth0:1 http://www.netscape.com/

              If this option is used several times, the last one will be used.

       -j, --junk-session-cookies
              (HTTP) When curl is told to read cookies from a given file, this option will make it discard all "session cookies". This will basically have the same effect as if a new session is started. Typical browsers always discard session cookies when they're closed down.

       -J, --remote-header-name
              (HTTP) This option tells the -O, --remote-name option to use the server-specified Content-Disposition filename instead of extracting a filename from the URL.

              There's no attempt to decode %-sequences (yet) in the provided file name, so this option may provide you with rather unexpected file names.

       -k, --insecure
              (SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers. All SSL connections are attempted to be made secure by using the CA certificate bundle installed by default. This makes all connections considered "insecure" fail unless -k, --insecure is used.

              See this online resource for further details: http://curl.haxx.se/docs/sslcerts.html
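
              As a hedged sketch (placeholder host), fetching from a server whose certificate cannot be verified, for example one using a self-signed certificate:

                curl -k https://self-signed.example.com/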

       -K, --config <config file>
              Specify which config file to read curl arguments from. The config file is a text file in which command line arguments can be written which then will be used as if they were written on the actual command line.

              Options and their parameters must be specified on the same config file line, separated by whitespace, colon, or the equals sign. Long option names can optionally be given in the config file without the initial double dashes and if so, the colon or equals characters can be used as separators. If the option is specified with one or two dashes, there can be no colon or equals character between the option and its parameter.

              If the parameter is to contain whitespace, the parameter must be enclosed within quotes. Within double quotes, the following escape sequences are available: \\, \", \t, \n, \r and \v. A backslash preceding any other letter is ignored. If the first column of a config line is a '#' character, the rest of the line will be treated as a comment. Only write one option per physical line in the config file.

              Specify the filename to -K, --config as '-' to make curl read the file from stdin.

              Note that to be able to specify a URL in the config file, you need to specify it using the --url option, and not by simply writing the URL on its own line. So, it could look similar to this:

                url = "http://curl.haxx.se/docs/"

              When curl is invoked, it always (unless -q is used) checks for a default config file and uses it if found. The default config file is checked for in the following places in this order:

              1) curl tries to find the "home dir": It first checks for the CURL_HOME and then the HOME environment variables. Failing that, it uses getpwuid() on UNIX-like systems (which returns the home dir given the current user in your system). On Windows, it then checks for the APPDATA variable, or as a last resort the '%USERPROFILE%\Application Data'.

              2) On Windows, if there is no _curlrc file in the home dir, it checks for one in the same dir the curl executable is placed. On UNIX-like systems, it will simply try to load .curlrc from the determined home dir.

                # --- Example file ---
                # this is a comment
                url = "curl.haxx.se"
                output = "curlhere.html"
                user-agent = "superagent/1.0"

                # and fetch another URL too
                url = "curl.haxx.se/docs/manpage.html"
                -O
                referer = "http://nowhereatall.com/"
                # --- End of example file ---

              This option can be used multiple times to load multiple config files.

       --keepalive-time <seconds>
              This option sets the time a connection needs to remain idle before sending keepalive probes and the time between individual keepalive probes. It is currently effective on operating systems offering the TCP_KEEPIDLE and TCP_KEEPINTVL socket options (meaning Linux, recent AIX, HP-UX and more). This option has no effect if --no-keepalive is used. (Added in 7.18.0)

              If this option is used several times, the last one will be used. If unspecified, the option defaults to 60 seconds.

       --key <key>
              (SSL/SSH) Private key file name. Allows you to provide your private key in this separate file.

              If this option is used several times, the last one will be used.

       --key-type <type>
              (SSL) Private key file type. Specify which type your --key provided private key is. DER, PEM, and ENG are supported. If not specified, PEM is assumed.

              If this option is used several times, the last one will be used.

       --krb <level>
              (FTP) Enable Kerberos authentication and use. The level must be entered and should be one of 'clear', 'safe', 'confidential', or 'private'. Should you use a level that is not one of these, 'private' will instead be used.

              This option requires a library built with kerberos4 or GSSAPI (GSS-Negotiate) support. This is not very common. Use -V, --version to see if your curl supports it.

              If this option is used several times, the last one will be used.

       -l, --list-only
              (FTP) When listing an FTP directory, this switch forces a name-only view. This is especially useful if the user wants to machine-parse the contents of an FTP directory since the normal directory view doesn't use a standard look or format. When used like this, the option causes an NLST command to be sent to the server instead of LIST.

              Note: Some FTP servers list only files in their response to NLST; they do not include sub-directories and symbolic links.

              (POP3) When retrieving a specific email from POP3, this switch forces a LIST command to be performed instead of RETR. This is particularly useful if the user wants to see if a specific message id exists on the server and what size it is.

              Note: When combined with -X, --request <command>, this option can be used to send a UIDL command instead, so the user may use the email's unique identifier rather than its message id to make the request. (Added in 7.21.5)
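
              For example (placeholder host), a machine-parseable name-only listing of a directory:

                curl -l ftp://ftp.example.com/pub/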

       -L, --location
              (HTTP/HTTPS) If the server reports that the requested page has moved to a different location (indicated with a Location: header and a 3XX response code), this option will make curl redo the request on the new place. If used together with -i, --include or -I, --head, headers from all requested pages will be shown. When authentication is used, curl only sends its credentials to the initial host. If a redirect takes curl to a different host, it won't be able to intercept the user+password. See also --location-trusted on how to change this. You can limit the amount of redirects to follow by using the --max-redirs option.

              When curl follows a redirect and the request is not a plain GET (for example POST or PUT), it will do the following request with a GET if the HTTP response was 301, 302, or 303. If the response code was any other 3xx code, curl will re-send the following request using the same unmodified method.

       --libcurl <file>
              Append this option to any ordinary curl command line, and you will get libcurl-using C source code written to the file that does the equivalent of what your command-line operation does!

              If this option is used several times, the last given file name will be used. (Added in 7.16.1)

       --limit-rate <speed>
              Specify the maximum transfer rate you want curl to use. This feature is useful if you have a limited pipe and you'd like your transfer not to use your entire bandwidth.

              The given speed is measured in bytes/second, unless a suffix is appended. Appending 'k' or 'K' will count the number as kilobytes, 'm' or 'M' makes it megabytes, while 'g' or 'G' makes it gigabytes. Examples: 200K, 3m and 1G.

              The given rate is the average speed counted during the entire transfer. It means that curl might use higher transfer speeds in short bursts, but over time it uses no more than the given rate. If you also use the -Y, --speed-limit option, that option will take precedence and might cripple the rate-limiting slightly, to help keeping the speed-limit logic working.

              If this option is used several times, the last one will be used.

       --local-port <num>[-num]
              Set a preferred number or range of local port numbers to use for the connection(s). Note that port numbers by nature are a scarce resource that will be busy at times so setting this range to something too narrow might cause unnecessary connection setup failures. (Added in 7.15.2)

       --location-trusted
              (HTTP/HTTPS) Like -L, --location, but will allow sending the name + password to all hosts that the site may redirect to. This may or may not introduce a security breach if the site redirects you to a site to which you'll send your authentication info (which is plaintext in the case of HTTP Basic authentication).

       -m, --max-time <seconds>
              Maximum time in seconds that you allow the whole operation to take. This is useful for preventing your batch jobs from hanging for hours due to slow networks or links going down. Since 7.32.0, this option accepts decimal values, but the actual timeout will decrease in accuracy as the specified timeout increases in decimal precision. See also the --connect-timeout option.

              If this option is used several times, the last one will be used.

       --mail-auth <address>
              (SMTP) Specify a single address. This will be used to specify the authentication address (identity) of a submitted message that is being relayed to another server. (Added in 7.25.0)

       --mail-from <address>
              (SMTP) Specify a single address that the given mail should get sent from. (Added in 7.20.0)
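
              As a hedged sketch of a simple SMTP submission (all addresses and the mail server are placeholders; --mail-rcpt is described further below, and the use of -T, --upload-file for the message body is an assumption not covered in this part of the manual):

                curl --mail-from sender@example.com --mail-rcpt receiver@example.com -T mail.txt smtp://mail.example.com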

       --max-filesize <bytes>
              Specify the maximum size (in bytes) of a file to download. If the file requested is larger than this value, the transfer will not start and curl will return with exit code 63.

              NOTE: The file size is not always known prior to download, and for such files this option has no effect even if the file transfer ends up being larger than this given limit. This concerns both FTP and HTTP transfers.

       --mail-rcpt <address>
              (SMTP) Specify a single address, user name or mailing list name. When performing a mail transfer, the recipient should specify a valid email address to send the mail to. (Added in 7.20.0)

              When performing an address verification (VRFY command), the recipient should be specified as the user name or user name and domain (as per Section 3.5 of RFC 5321). (Added in 7.34.0)

              When performing a mailing list expand (EXPN command), the recipient should be specified using the mailing list name, such as "Friends" or "London-Office". (Added in 7.34.0)

       --max-redirs <num>
              Set maximum number of redirection-followings allowed. If -L, --location is used, this option can be used to prevent curl from following redirections "in absurdum". By default, the limit is set to 50 redirections. Set this option to -1 to make it limitless.

              If this option is used several times, the last one will be used.

       --metalink
              This option can tell curl to parse and process a given URI as a Metalink file (both version 3 and 4 (RFC 5854) are supported) and make use of the mirrors listed within for failover if there are errors (such as the file or server not being available). It will also verify the hash of the file after the download completes. The Metalink file itself is downloaded and processed in memory and not stored in the local file system.

              Example to use a remote Metalink file:

                curl --metalink http://www.example.com/example.metalink

              To use a Metalink file in the local file system, use FILE protocol (file://):

                curl --metalink file://example.metalink

              Please note that if FILE protocol is disabled, there is no way to use a local Metalink file at the time of this writing. Also note that if --metalink and --include are used together, --include will be ignored. This is because including headers in the response will break the Metalink parser and if the headers are included in the file described in the Metalink file, the hash check will fail.

              (Added in 7.27.0, if built against the libmetalink library.)

       -n, --netrc
              Makes curl scan the .netrc (_netrc on Windows) file in the user's home directory for login name and password. This is typically used for FTP on UNIX. If used with HTTP, curl will enable user authentication. See netrc(4) or ftp(1) for details on the file format. Curl will not complain if that file doesn't have the right permissions (it should not be either world- or group-readable). The environment variable "HOME" is used to find the home directory.

              A quick and very simple example of how to set up a .netrc to allow curl to FTP to the machine host.domain.com with user name 'myself' and password 'secret' should look similar to:

                machine host.domain.com login myself password secret
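
              With that .netrc entry in place, a login-free transfer could then look like this (the remote path is a placeholder):

                curl -n ftp://host.domain.com/pub/file.txt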

       -N, --no-buffer
              Disables the buffering of the output stream. In normal work situations, curl will use a standard buffered output stream that will have the effect that it will output the data in chunks, not necessarily exactly when the data arrives. Using this option will disable that buffering.

              Note that this is the negated option name documented. You can thus use --buffer to enforce the buffering.

       --netrc-file
              This option is similar to --netrc, except that you provide the path (absolute or relative) to the netrc file that Curl should use. You can only specify one netrc file per invocation. If several --netrc-file options are provided, only the last one will be used. (Added in 7.21.5)

              This option overrides any use of --netrc as they are mutually exclusive. It will also abide by --netrc-optional if specified.

       --netrc-optional
              Very similar to --netrc, but this option makes the .netrc usage optional and not mandatory as the --netrc option does.

       --negotiate
              (HTTP) Enables GSS-Negotiate authentication. The GSS-Negotiate method was designed by Microsoft and is used in their web applications. It is primarily meant as a support for Kerberos5 authentication but may be also used along with another authentication method. For more information see IETF draft draft-brezak-spnego-http-04.txt.

              If you want to enable Negotiate for your proxy authentication, then use --proxy-negotiate.

              This option requires a library built with GSSAPI support. This is not very common. Use -V, --version to see if your version supports GSS-Negotiate.

              When using this option, you must also provide a fake -u, --user option to activate the authentication code properly. Sending a '-u :' is enough as the user name and password from the -u option aren't actually used.

              If this option is used several times, only the first one is used.

       --no-keepalive
              Disables the use of keepalive messages on the TCP connection, as by default curl enables them.

              Note that this is the negated option name documented. You can thus use --keepalive to enforce keepalive.

       --no-sessionid
              (SSL) Disable curl's use of SSL session-ID caching. By default all transfers are done using the cache. Note that while nothing should ever get hurt by attempting to reuse SSL session-IDs, there seem to be broken SSL implementations in the wild that may require you to disable this in order for you to succeed. (Added in 7.16.0)

              Note that this is the negated option name documented. You can thus use --sessionid to enforce session-ID caching.

       --noproxy <no-proxy-list>
              Comma-separated list of hosts which do not use a proxy, if one is specified. The only wildcard is a single * character, which matches all hosts, and effectively disables the proxy. Each name in this list is matched as either a domain which contains the hostname, or the hostname itself. For example, local.com would match local.com, local.com:80, and www.local.com, but not www.notlocal.com. (Added in 7.19.4)
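
              A hedged sketch (the proxy address is a placeholder; -x, --proxy names the proxy): use the proxy for all hosts except local.com and its subdomains:

                curl --noproxy local.com -x proxy.example.com:8080 http://www.local.com/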

       --ntlm (HTTP) Enables NTLM authentication. The NTLM authentication method was designed by Microsoft and is used by IIS web servers. It is a proprietary protocol, reverse-engineered by clever people and implemented in curl based on their efforts. This kind of behavior should not be endorsed, you should encourage everyone who uses NTLM to switch to a public and documented authentication method instead, such as Digest.

              If you want to enable NTLM for your proxy authentication, then use --proxy-ntlm.

              This option requires a library built with SSL support. Use -V, --version to see if your curl supports NTLM.

              If this option is used several times, only the first one is used.

       -o, --output <file>
              Write output to <file> instead of stdout. If you are using {} or [] to fetch multiple documents, you can use '#' followed by a number in the <file> specifier. That variable will be replaced with the current string for the URL being fetched. Like in:

                curl http://{one,two}.site.com -o "file_#1.txt"

              or use several variables like:

                curl http://{site,host}.host[1-5].com -o "#1_#2"

              You may use this option as many times as the number of URLs you have.

              See also the --create-dirs option to create the local directories dynamically. Specifying the output as '-' (a single dash) will force the output to be done to stdout.

       -O, --remote-name
              Write output to a local file named like the remote file we get. (Only the file part of the remote file is used, the path is cut off.)

              The remote file name to use for saving is extracted from the given URL, nothing else.

              Consequently, the file will be saved in the current working directory. If you want the file saved in a different directory, make sure you change current working directory before you invoke curl with the -O, --remote-name flag!

              There is no URL decoding done on the file name. If it has %20 or other URL encoded parts of the name, they will end up as-is as file name.

              You may use this option as many times as the number of URLs you have.

       --oauth2-bearer
              (IMAP, POP3, SMTP) Specify the Bearer Token for OAUTH 2.0 server authentication. The Bearer Token is used in conjunction with the user name which can be specified as part of the --url or -u, --user options.

              The Bearer Token and user name are formatted according to RFC 6750.

              If this option is used several times, the last one will be used.

       -p, --proxytunnel
              When an HTTP proxy is used (-x, --proxy), this option will cause non-HTTP protocols to attempt to tunnel through the proxy instead of merely using it to do HTTP-like operations. The tunnel approach is made with the HTTP proxy CONNECT request and requires that the proxy allows direct connect to the remote port number curl wants to tunnel through to.
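
              As a hedged example (proxy and host are placeholders), tunnelling an FTP transfer through an HTTP proxy with CONNECT:

                curl -p -x proxy.example.com:8080 ftp://ftp.example.com/file.txt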

       -P, --ftp-port <address>
              (FTP) Reverses the default initiator/listener roles when connecting with FTP. This switch makes curl use active mode. In practice, curl then tells the server to connect back to the client's specified address and port, while passive mode asks the server to set up an IP address and port for it to connect to. <address> should be one of:

              interface
                     i.e "eth0" to specify which interface's IP address you want to use (Unix only)

              IP address
                     i.e "192.168.10.1" to specify the exact IP address

              host name
                     i.e "my.host.domain" to specify the machine

              -      make curl pick the same IP address that is already used for the control connection

              If this option is used several times, the last one will be used. Disable the use of PORT with --ftp-pasv. Disable the attempt to use the EPRT command instead of PORT by using --disable-eprt. EPRT is really PORT++.

              Starting in 7.19.5, you can append ":[start]-[end]" to the right of the address, to tell curl what TCP port range to use. That means you specify a port range, from a lower to a higher number. A single number works as well, but do note that it increases the risk of failure since the port may not be available.

       --pass <phrase>
              (SSL/SSH) Passphrase for the private key.

              If this option is used several times, the last one will be used.

       --post301
              (HTTP) Tells curl to respect RFC 2616/10.3.2 and not convert POST requests into GET requests when following a 301 redirection. The non-RFC behaviour is ubiquitous in web browsers, so curl does the conversion by default to maintain consistency. However, a server may require a POST to remain a POST after such a redirection. This option is meaningful only when using -L, --location. (Added in 7.17.1)

       --post302
              (HTTP) Tells curl to respect RFC 2616/10.3.2 and not convert POST requests into GET requests when following a 302 redirection. The non-RFC behaviour is ubiquitous in web browsers, so curl does the conversion by default to maintain consistency. However, a server may require a POST to remain a POST after such a redirection. This option is meaningful only when using -L, --location. (Added in 7.19.1)

       --post303
              (HTTP) Tells curl to respect RFC 2616/10.3.2 and not convert POST requests into GET requests when following a 303 redirection. The non-RFC behaviour is ubiquitous in web browsers, so curl does the conversion by default to maintain consistency. However, a server may require a POST to remain a POST after such a redirection. This option is meaningful only when using -L, --location. (Added in 7.26.0)
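
              For example, a hedged sketch (placeholder URL and data) that keeps a POST a POST if the server redirects with a 303:

                curl -L --post303 -d "name=value" http://example.com/redirecting-form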
This allows scripts to 1351 safely rely on being able to disable potentially dangerous pro- 1352 tocols, without relying upon support for that protocol being 1353 built into curl to avoid an error. 1354 1355 This option can be used multiple times, in which case the effect 1356 is the same as concatenating the protocols into one instance of 1357 the option. 1358 1359 (Added in 7.20.2) 1360 1361 --proto-redir <protocols> 1362 Tells curl to use the listed protocols after a redirect. See 1363 --proto for how protocols are represented. 1364 1365 (Added in 7.20.2) 1366 1367 --proxy-anyauth 1368 Tells curl to pick a suitable authentication method when commu- 1369 nicating with the given proxy. This might cause an extra 1370 request/response round-trip. (Added in 7.13.2) 1371 1372 --proxy-basic 1373 Tells curl to use HTTP Basic authentication when communicating 1374 with the given proxy. Use --basic for enabling HTTP Basic with a 1375 remote host. Basic is the default authentication method curl 1376 uses with proxies. 1377 1378 --proxy-digest 1379 Tells curl to use HTTP Digest authentication when communicating 1380 with the given proxy. Use --digest for enabling HTTP Digest with 1381 a remote host. 1382 1383 --proxy-negotiate 1384 Tells curl to use HTTP Negotiate authentication when communicat- 1385 ing with the given proxy. Use --negotiate for enabling HTTP 1386 Negotiate with a remote host. (Added in 7.17.1) 1387 1388 --proxy-ntlm 1389 Tells curl to use HTTP NTLM authentication when communicating 1390 with the given proxy. Use --ntlm for enabling NTLM with a remote 1391 host. 1392 1393 --proxy1.0 <proxyhost[:port]> 1394 Use the specified HTTP 1.0 proxy. If the port number is not 1395 specified, it is assumed at port 1080. 1396 1397 The only difference between this and the HTTP proxy option (-x, 1398 --proxy), is that attempts to use CONNECT through the proxy will 1399 specify an HTTP 1.0 protocol instead of the default HTTP 1.1. 1400 1401 --pubkey <key> 1402 (SSH) Public key file name. Allows you to provide your public 1403 key in this separate file. 1404 1405 If this option is used several times, the last one will be used. 1406 1407 -q If used as the first parameter on the command line, the curlrc 1408 config file will not be read and used. See the -K, --config for 1409 details on the default config file search path. 1410 1411 -Q, --quote <command> 1412 (FTP/SFTP) Send an arbitrary command to the remote FTP or SFTP 1413 server. Quote commands are sent BEFORE the transfer takes place 1414 (just after the initial PWD command in an FTP transfer, to be 1415 exact). To make commands take place after a successful transfer, 1416 prefix them with a dash '-'. To make commands be sent after 1417 curl has changed the working directory, just before the transfer 1418 command(s), prefix the command with a '+' (this is only sup- 1419 ported for FTP). You may specify any number of commands. If the 1420 server returns failure for one of the commands, the entire oper- 1421 ation will be aborted. You must send syntactically correct FTP 1422 commands as RFC 959 defines to FTP servers, or one of the com- 1423 mands listed below to SFTP servers. This option can be used 1424 multiple times. When speaking to an FTP server, prefix the com- 1425 mand with an asterisk (*) to make curl continue even if the com- 1426 mand fails as by default curl will stop at first failure. 1427 1428 SFTP is a binary protocol. Unlike for FTP, curl interprets SFTP 1429 quote commands itself before sending them to the server. 
File 1430 names may be quoted shell-style to embed spaces or special char- 1431 acters. Following is the list of all supported SFTP quote com- 1432 mands: 1433 1434 chgrp group file 1435 The chgrp command sets the group ID of the file named by 1436 the file operand to the group ID specified by the group 1437 operand. The group operand is a decimal integer group ID. 1438 1439 chmod mode file 1440 The chmod command modifies the file mode bits of the 1441 specified file. The mode operand is an octal integer mode 1442 number. 1443 1444 chown user file 1445 The chown command sets the owner of the file named by the 1446 file operand to the user ID specified by the user oper- 1447 and. The user operand is a decimal integer user ID. 1448 1449 ln source_file target_file 1450 The ln and symlink commands create a symbolic link at the 1451 target_file location pointing to the source_file loca- 1452 tion. 1453 1454 mkdir directory_name 1455 The mkdir command creates the directory named by the 1456 directory_name operand. 1457 1458 pwd The pwd command returns the absolute pathname of the cur- 1459 rent working directory. 1460 1461 rename source target 1462 The rename command renames the file or directory named by 1463 the source operand to the destination path named by the 1464 target operand. 1465 1466 rm file 1467 The rm command removes the file specified by the file op- 1468 erand. 1469 1470 rmdir directory 1471 The rmdir command removes the directory entry specified 1472 by the directory operand, provided it is empty. 1473 1474 symlink source_file target_file 1475 See ln. 1476 1477 -r, --range <range> 1478 (HTTP/FTP/SFTP/FILE) Retrieve a byte range (i.e a partial docu- 1479 ment) from a HTTP/1.1, FTP or SFTP server or a local FILE. 1480 Ranges can be specified in a number of ways. 1481 1482 0-499 specifies the first 500 bytes 1483 1484 500-999 specifies the second 500 bytes 1485 1486 -500 specifies the last 500 bytes 1487 1488 9500- specifies the bytes from offset 9500 and forward 1489 1490 0-0,-1 specifies the first and last byte only(*)(H) 1491 1492 500-700,600-799 1493 specifies 300 bytes from offset 500(H) 1494 1495 100-199,500-599 1496 specifies two separate 100-byte ranges(*)(H) 1497 1498 (*) = NOTE that this will cause the server to reply with a multipart 1499 response! 1500 1501 Only digit characters (0-9) are valid in the 'start' and 'stop' fields 1502 of the 'start-stop' range syntax. If a non-digit character is given in 1503 the range, the server's response will be unspecified, depending on the 1504 server's configuration. 1505 1506 You should also be aware that many HTTP/1.1 servers do not have this 1507 feature enabled, so that when you attempt to get a range, you'll 1508 instead get the whole document. 1509 1510 FTP and SFTP range downloads only support the simple 'start-stop' syn- 1511 tax (optionally with one of the numbers omitted). FTP use depends on 1512 the extended FTP command SIZE. 1513 1514 If this option is used several times, the last one will be used. 1515 1516 -R, --remote-time 1517 When used, this will make curl attempt to figure out the time- 1518 stamp of the remote file, and if that is available make the 1519 local file get that same timestamp. 1520 1521 --random-file <file> 1522 (SSL) Specify the path name to file containing what will be con- 1523 sidered as random data. The data is used to seed the random 1524 engine for SSL connections. See also the --egd-file option. 
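For example, on a system that provides /dev/urandom you could point curl at it as the seed source (the path and URL here are only illustrations):

              curl --random-file /dev/urandom https://example.com/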
1525 1526 --raw (HTTP) When used, it disables all internal HTTP decoding of con- 1527 tent or transfer encodings and instead makes them passed on 1528 unaltered, raw. (Added in 7.16.2) 1529 1530 --remote-name-all 1531 This option changes the default action for all given URLs to be 1532 dealt with as if -O, --remote-name were used for each one. So if 1533 you want to disable that for a specific URL after --remote-name- 1534 all has been used, you must use "-o -" or --no-remote-name. 1535 (Added in 7.19.0) 1536 1537 --resolve <host:port:address> 1538 Provide a custom address for a specific host and port pair. 1539 Using this, you can make the curl requests(s) use a specified 1540 address and prevent the otherwise normally resolved address to 1541 be used. Consider it a sort of /etc/hosts alternative provided 1542 on the command line. The port number should be the number used 1543 for the specific protocol the host will be used for. It means 1544 you need several entries if you want to provide address for the 1545 same host but different ports. 1546 1547 This option can be used many times to add many host names to 1548 resolve. 1549 1550 (Added in 7.21.3) 1551 1552 --retry <num> 1553 If a transient error is returned when curl tries to perform a 1554 transfer, it will retry this number of times before giving up. 1555 Setting the number to 0 makes curl do no retries (which is the 1556 default). Transient error means either: a timeout, an FTP 4xx 1557 response code or an HTTP 5xx response code. 1558 1559 When curl is about to retry a transfer, it will first wait one 1560 second and then for all forthcoming retries it will double the 1561 waiting time until it reaches 10 minutes which then will be the 1562 delay between the rest of the retries. By using --retry-delay 1563 you disable this exponential backoff algorithm. See also 1564 --retry-max-time to limit the total time allowed for retries. 1565 (Added in 7.12.3) 1566 1567 If this option is used several times, the last one will be used. 1568 1569 --retry-delay <seconds> 1570 Make curl sleep this amount of time before each retry when a 1571 transfer has failed with a transient error (it changes the 1572 default backoff time algorithm between retries). This option is 1573 only interesting if --retry is also used. Setting this delay to 1574 zero will make curl use the default backoff time. (Added in 1575 7.12.3) 1576 1577 If this option is used several times, the last one will be used. 1578 1579 --retry-max-time <seconds> 1580 The retry timer is reset before the first transfer attempt. 1581 Retries will be done as usual (see --retry) as long as the timer 1582 hasn't reached this given limit. Notice that if the timer hasn't 1583 reached the limit, the request will be made and while perform- 1584 ing, it may take longer than this given time period. To limit a 1585 single request's maximum time, use -m, --max-time. Set this 1586 option to zero to not timeout retries. (Added in 7.12.3) 1587 1588 If this option is used several times, the last one will be used. 1589 1590 -s, --silent 1591 Silent or quiet mode. Don't show progress meter or error mes- 1592 sages. Makes Curl mute. It will still output the data you ask 1593 for, potentially even to the terminal/stdout unless you redirect 1594 it. 1595 1596 --sasl-ir 1597 Enable initial response in SASL authentication. (Added in 1598 7.31.0) 1599 1600 -S, --show-error 1601 When used with -s it makes curl show an error message if it 1602 fails. 
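For example, a transfer that stays quiet but still reports a failure on stderr could look like this (the URL and output file name are only placeholders):

              curl -sS -o page.html http://example.com/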
1603 1604 --ssl (FTP, POP3, IMAP, SMTP) Try to use SSL/TLS for the connection. 1605 Reverts to a non-secure connection if the server doesn't support 1606 SSL/TLS. See also --ftp-ssl-control and --ssl-reqd for differ- 1607 ent levels of encryption required. (Added in 7.20.0) 1608 1609 This option was formerly known as --ftp-ssl (Added in 7.11.0). 1610 That option name can still be used but will be removed in a 1611 future version. 1612 1613 --ssl-reqd 1614 (FTP, POP3, IMAP, SMTP) Require SSL/TLS for the connection. 1615 Terminates the connection if the server doesn't support SSL/TLS. 1616 (Added in 7.20.0) 1617 1618 This option was formerly known as --ftp-ssl-reqd (added in 1619 7.15.5). That option name can still be used but will be removed 1620 in a future version. 1621 1622 --ssl-allow-beast 1623 (SSL) This option tells curl to not work around a security flaw 1624 in the SSL3 and TLS1.0 protocols known as BEAST. If this option 1625 isn't used, the SSL layer may use work-arounds known to cause 1626 interoperability problems with some older SSL implementations. 1627 WARNING: this option loosens the SSL security, and by using this 1628 flag you ask for exactly that. (Added in 7.25.0) 1629 1630 --socks4 <host[:port]> 1631 Use the specified SOCKS4 proxy. If the port number is not speci- 1632 fied, it is assumed at port 1080. (Added in 7.15.2) 1633 1634 This option overrides any previous use of -x, --proxy, as they 1635 are mutually exclusive. 1636 1637 Since 7.21.7, this option is superfluous since you can specify a 1638 socks4 proxy with -x, --proxy using a socks4:// protocol prefix. 1639 If this option is used several times, the last one will be used. 1640 1641 --socks4a <host[:port]> 1642 Use the specified SOCKS4a proxy. If the port number is not spec- 1643 ified, it is assumed at port 1080. (Added in 7.18.0) 1644 1645 This option overrides any previous use of -x, --proxy, as they 1646 are mutually exclusive. 1647 1648 Since 7.21.7, this option is superfluous since you can specify a 1649 socks4a proxy with -x, --proxy using a socks4a:// protocol pre- 1650 fix. 1651 1652 If this option is used several times, the last one will be used. 1653 1654 --socks5-hostname <host[:port]> 1655 Use the specified SOCKS5 proxy (and let the proxy resolve the 1656 host name). If the port number is not specified, it is assumed 1657 at port 1080. (Added in 7.18.0) 1658 1659 This option overrides any previous use of -x, --proxy, as they 1660 are mutually exclusive. 1661 1662 Since 7.21.7, this option is superfluous since you can specify a 1663 socks5 hostname proxy with -x, --proxy using a socks5h:// proto- 1664 col prefix. 1665 1666 If this option is used several times, the last one will be used. 1667 (This option was previously wrongly documented and used as 1668 --socks without the number appended.) 1669 1670 --socks5 <host[:port]> 1671 Use the specified SOCKS5 proxy - but resolve the host name 1672 locally. If the port number is not specified, it is assumed at 1673 port 1080. 1674 1675 This option overrides any previous use of -x, --proxy, as they 1676 are mutually exclusive. 1677 1678 Since 7.21.7, this option is superfluous since you can specify a 1679 socks5 proxy with -x, --proxy using a socks5:// protocol prefix. 1680 If this option is used several times, the last one will be used. 1681 (This option was previously wrongly documented and used as 1682 --socks without the number appended.) 1683 1684 This option (as well as --socks4) does not work with IPV6, FTPS 1685 or LDAP. 
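For example, to pass a request through a SOCKS5 proxy listening on the local machine, resolving the host name locally, a command line could look like this (the proxy address and URL are only placeholders):

              curl --socks5 localhost:1080 http://example.com/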
1686 1687 --socks5-gssapi-service <servicename> 1688 The default service name for a socks server is rcmd/server-fqdn. 1689 This option allows you to change it. 1690 1691 Examples: --socks5 proxy-name --socks5-gssapi-service sockd 1692 would use sockd/proxy-name --socks5 proxy-name --socks5-gssapi- 1693 service sockd/real-name would use sockd/real-name for cases 1694 where the proxy-name does not match the principal name. (Added 1695 in 7.19.4). 1696 1697 --socks5-gssapi-nec 1698 As part of the gssapi negotiation a protection mode is negoti- 1699 ated. RFC 1961 says in section 4.3/4.4 it should be protected, 1700 but the NEC reference implementation does not. The option 1701 --socks5-gssapi-nec allows the unprotected exchange of the pro- 1702 tection mode negotiation. (Added in 7.19.4). 1703 1704 --stderr <file> 1705 Redirect all writes to stderr to the specified file instead. If 1706 the file name is a plain '-', it is instead written to stdout. 1707 1708 If this option is used several times, the last one will be used. 1709 1710 -t, --telnet-option <OPT=val> 1711 Pass options to the telnet protocol. Supported options are: 1712 1713 TTYPE=<term> Sets the terminal type. 1714 1715 XDISPLOC=<X display> Sets the X display location. 1716 1717 NEW_ENV=<var,val> Sets an environment variable. 1718 1719 -T, --upload-file <file> 1720 This transfers the specified local file to the remote URL. If 1721 there is no file part in the specified URL, Curl will append the 1722 local file name. NOTE that you must use a trailing / on the last 1723 directory to really prove to Curl that there is no file name or 1724 curl will think that your last directory name is the remote file 1725 name to use. That will most likely cause the upload operation to 1726 fail. If this is used on an HTTP(S) server, the PUT command will 1727 be used. 1728 1729 Use the file name "-" (a single dash) to use stdin instead of a 1730 given file. Alternately, the file name "." (a single period) 1731 may be specified instead of "-" to use stdin in non-blocking 1732 mode to allow reading server output while stdin is being 1733 uploaded. 1734 1735 You can specify one -T for each URL on the command line. Each -T 1736 + URL pair specifies what to upload and to where. curl also sup- 1737 ports "globbing" of the -T argument, meaning that you can upload 1738 multiple files to a single URL by using the same URL globbing 1739 style supported in the URL, like this: 1740 1741 curl -T "{file1,file2}" http://www.uploadtothissite.com 1742 1743 or even 1744 1745 curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/ 1746 1747 --tcp-nodelay 1748 Turn on the TCP_NODELAY option. See the curl_easy_setopt(3) man 1749 page for details about this option. (Added in 7.11.2) 1750 1751 --tftp-blksize <value> 1752 (TFTP) Set TFTP BLKSIZE option (must be >512). This is the block 1753 size that curl will try to use when transferring data to or from 1754 a TFTP server. By default 512 bytes will be used. 1755 1756 If this option is used several times, the last one will be used. 1757 1758 (Added in 7.20.0) 1759 1760 --tlsauthtype <authtype> 1761 Set TLS authentication type. Currently, the only supported 1762 option is "SRP", for TLS-SRP (RFC 5054). If --tlsuser and 1763 --tlspassword are specified but --tlsauthtype is not, then this 1764 option defaults to "SRP". (Added in 7.21.4) 1765 1766 --tlspassword <password> 1767 Set password for use with the TLS authentication method speci- 1768 fied with --tlsauthtype. Requires that --tlsuser also be set. 
1769 (Added in 7.21.4) 1770 1771 --tlsuser <user> 1772 Set username for use with the TLS authentication method speci- 1773 fied with --tlsauthtype. Requires that --tlspassword also be 1774 set. (Added in 7.21.4) 1775 1776 --tlsv1.0 1777 (SSL) Forces curl to use TLS version 1.0 when negotiating with a 1778 remote TLS server. (Added in 7.34.0) 1779 1780 --tlsv1.1 1781 (SSL) Forces curl to use TLS version 1.1 when negotiating with a 1782 remote TLS server. (Added in 7.34.0) 1783 1784 --tlsv1.2 1785 (SSL) Forces curl to use TLS version 1.2 when negotiating with a 1786 remote TLS server. (Added in 7.34.0) 1787 1788 --tr-encoding 1789 (HTTP) Request a compressed Transfer-Encoding response using one 1790 of the algorithms curl supports, and uncompress the data while 1791 receiving it. 1792 1793 (Added in 7.21.6) 1794 1795 --trace <file> 1796 Enables a full trace dump of all incoming and outgoing data, 1797 including descriptive information, to the given output file. Use 1798 "-" as filename to have the output sent to stdout. 1799 1800 This option overrides previous uses of -v, --verbose or --trace- 1801 ascii. 1802 1803 If this option is used several times, the last one will be used. 1804 1805 --trace-ascii <file> 1806 Enables a full trace dump of all incoming and outgoing data, 1807 including descriptive information, to the given output file. Use 1808 "-" as filename to have the output sent to stdout. 1809 1810 This is very similar to --trace, but leaves out the hex part and 1811 only shows the ASCII part of the dump. It makes smaller output 1812 that might be easier to read for untrained humans. 1813 1814 This option overrides previous uses of -v, --verbose or --trace. 1815 If this option is used several times, the last one will be used. 1816 1817 --trace-time 1818 Prepends a time stamp to each trace or verbose line that curl 1819 displays. (Added in 7.14.0) 1820 1821 -u, --user <user:password;options> 1822 Specify the user name, password and optional login options to 1823 use for server authentication. Overrides -n, --netrc and 1824 --netrc-optional. 1825 1826 If you simply specify the user name, with or without the login 1827 options, curl will prompt for a password. 1828 1829 If you use an SSPI-enabled curl binary and perform NTLM authen- 1830 tication, you can force curl to select the user name and pass- 1831 word from your environment by simply specifying a single colon 1832 with this option: "-u :" or by specifying the login options on 1833 their own, for example "-u ;auth=NTLM". 1834 1835 You can use the optional login options part to specify protocol 1836 specific options that may be used during authentication. At 1837 present only IMAP, POP3 and SMTP support login options as part 1838 of the user login information. For more information about the 1839 login options please see RFC 2384, RFC 5092 and IETF draft 1840 draft-earhart-url-smtp-00.txt (Added in 7.31.0). 1841 1842 If this option is used several times, the last one will be used. 1843 1844 -U, --proxy-user <user:password> 1845 Specify the user name and password to use for proxy authentica- 1846 tion. 1847 1848 If you use an SSPI-enabled curl binary and do NTLM authentica- 1849 tion, you can force curl to pick up the user name and password 1850 from your environment by simply specifying a single colon with 1851 this option: "-U :". 1852 1853 If this option is used several times, the last one will be used. 1854 1855 --url <URL> 1856 Specify a URL to fetch.
This option is mostly handy when you 1857 want to specify URL(s) in a config file. 1858 1859 This option may be used any number of times. To control where 1860 this URL is written, use the -o, --output or the -O, --remote- 1861 name options. 1862 -v, --verbose 1863 Makes the fetching more verbose/talkative. Mostly useful for 1864 debugging. A line starting with '>' means "header data" sent by 1865 curl, '<' means "header data" received by curl that is hidden in 1866 normal cases, and a line starting with '*' means additional info 1867 provided by curl. 1868 1869 Note that if you only want HTTP headers in the output, -i, 1870 --include might be the option you're looking for. 1871 1872 If you think this option still doesn't give you enough details, 1873 consider using --trace or --trace-ascii instead. 1874 1875 This option overrides previous uses of --trace-ascii or --trace. 1876 1877 Use -s, --silent to make curl quiet. 1878 1879 -w, --write-out <format> 1880 Defines what to display on stdout after a completed and success- 1881 ful operation. The format is a string that may contain plain 1882 text mixed with any number of variables. The string can be spec- 1883 ified as "string", to get read from a particular file you spec- 1884 ify it "@filename" and to tell curl to read the format from 1885 stdin you write "@-". 1886 1887 The variables present in the output format will be substituted 1888 by the value or text that curl thinks fit, as described below. 1889 All variables are specified as %{variable_name} and to output a 1890 normal % you just write them as %%. You can output a newline by 1891 using \n, a carriage return with \r and a tab space with \t. 1892 1893 NOTE: The %-symbol is a special symbol in the win32-environment, 1894 where all occurrences of % must be doubled when using this 1895 option. 1896 1897 The variables available are: 1898 1899 content_type The Content-Type of the requested document, if 1900 there was any. 1901 1902 filename_effective 1903 The ultimate filename that curl writes out to. 1904 This is only meaningful if curl is told to write 1905 to a file with the --remote-name or --output 1906 option. It's most useful in combination with the 1907 --remote-header-name option. (Added in 7.25.1) 1908 1909 ftp_entry_path The initial path curl ended up in when logging on 1910 to the remote FTP server. (Added in 7.15.4) 1911 1912 http_code The numerical response code that was found in the 1913 last retrieved HTTP(S) or FTP(s) transfer. In 1914 7.18.2 the alias response_code was added to show 1915 the same info. 1916 1917 http_connect The numerical code that was found in the last 1918 response (from a proxy) to a curl CONNECT 1919 request. (Added in 7.12.4) 1920 1921 local_ip The IP address of the local end of the most 1922 recently done connection - can be either IPv4 or 1923 IPv6 (Added in 7.29.0) 1924 1925 local_port The local port number of the most recently done 1926 connection (Added in 7.29.0) 1927 1928 num_connects Number of new connects made in the recent trans- 1929 fer. (Added in 7.12.3) 1930 1931 num_redirects Number of redirects that were followed in the 1932 request. (Added in 7.12.3) 1933 1934 redirect_url When an HTTP request was made without -L to fol- 1935 low redirects, this variable will show the actual 1936 URL a redirect would take you to. 
(Added in 1937 7.18.2) 1938 1939 remote_ip The remote IP address of the most recently done 1940 connection - can be either IPv4 or IPv6 (Added in 1941 7.29.0) 1942 1943 remote_port The remote port number of the most recently done 1944 connection (Added in 7.29.0) 1945 1946 size_download The total amount of bytes that were downloaded. 1947 1948 size_header The total amount of bytes of the downloaded head- 1949 ers. 1950 1951 size_request The total amount of bytes that were sent in the 1952 HTTP request. 1953 1954 size_upload The total amount of bytes that were uploaded. 1955 1956 speed_download The average download speed that curl measured for 1957 the complete download. Bytes per second. 1958 1959 speed_upload The average upload speed that curl measured for 1960 the complete upload. Bytes per second. 1961 1962 ssl_verify_result 1963 The result of the SSL peer certificate verifica- 1964 tion that was requested. 0 means the verification 1965 was successful. (Added in 7.19.0) 1966 1967 time_appconnect 1968 The time, in seconds, it took from the start 1969 until the SSL/SSH/etc connect/handshake to the 1970 remote host was completed. (Added in 7.19.0) 1971 1972 time_connect The time, in seconds, it took from the start 1973 until the TCP connect to the remote host (or 1974 proxy) was completed. 1975 1976 time_namelookup 1977 The time, in seconds, it took from the start 1978 until the name resolving was completed. 1979 1980 time_pretransfer 1981 The time, in seconds, it took from the start 1982 until the file transfer was just about to begin. 1983 This includes all pre-transfer commands and nego- 1984 tiations that are specific to the particular pro- 1985 tocol(s) involved. 1986 1987 time_redirect The time, in seconds, it took for all redirection 1988 steps include name lookup, connect, pretransfer 1989 and transfer before the final transaction was 1990 started. time_redirect shows the complete execu- 1991 tion time for multiple redirections. (Added in 1992 7.12.3) 1993 1994 time_starttransfer 1995 The time, in seconds, it took from the start 1996 until the first byte was just about to be trans- 1997 ferred. This includes time_pretransfer and also 1998 the time the server needed to calculate the 1999 result. 2000 2001 time_total The total time, in seconds, that the full opera- 2002 tion lasted. The time will be displayed with mil- 2003 lisecond resolution. 2004 2005 url_effective The URL that was fetched last. This is most mean- 2006 ingful if you've told curl to follow location: 2007 headers. 2008 2009 If this option is used several times, the last one will be used. 2010 2011 -x, --proxy <[protocol://][user:password@]proxyhost[:port]> 2012 Use the specified proxy. 2013 2014 The proxy string can be specified with a protocol:// prefix to 2015 specify alternative proxy protocols. Use socks4://, socks4a://, 2016 socks5:// or socks5h:// to request the specific SOCKS version to 2017 be used. No protocol specified, http:// and all others will be 2018 treated as HTTP proxies. (The protocol support was added in curl 2019 7.21.7) 2020 2021 If the port number is not specified in the proxy string, it is 2022 assumed to be 1080. 2023 2024 This option overrides existing environment variables that set 2025 the proxy to use. If there's an environment variable setting a 2026 proxy, you can set proxy to "" to override it. 2027 2028 All operations that are performed over an HTTP proxy will trans- 2029 parently be converted to HTTP. It means that certain protocol 2030 specific operations might not be available. 
This is not the case 2031 if you can tunnel through the proxy, as done with the -p, --prox- 2032 ytunnel option. 2033 2034 User and password that might be provided in the proxy string are 2035 URL decoded by curl. This allows you to pass in special charac- 2036 ters such as @ by using %40 or pass in a colon with %3a. 2037 2038 The proxy host can be specified the exact same way as the proxy 2039 environment variables, including the protocol prefix (http://) 2040 and the embedded user + password. 2041 2042 If this option is used several times, the last one will be used. 2043 2044 -X, --request <command> 2045 (HTTP) Specifies a custom request method to use when communicat- 2046 ing with the HTTP server. The specified request will be used 2047 instead of the method otherwise used (which defaults to GET). 2048 Read the HTTP 1.1 specification for details and explanations. 2049 Common additional HTTP requests include PUT and DELETE, but 2050 related technologies like WebDAV offer PROPFIND, COPY, MOVE and 2051 more. 2052 2053 Normally you don't need this option. All sorts of GET, HEAD, 2054 POST and PUT requests are rather invoked by using dedicated com- 2055 mand line options. 2056 2057 This option only changes the actual word used in the HTTP 2058 request, it does not alter the way curl behaves. So for example 2059 if you want to make a proper HEAD request, using -X HEAD will 2060 not suffice. You need to use the -I, --head option. 2061 2062 (FTP) Specifies a custom FTP command to use instead of LIST when 2063 doing file lists with FTP. 2064 2065 (POP3) Specifies a custom POP3 command to use instead of LIST or 2066 RETR. (Added in 7.26.0) 2067 2068 (IMAP) Specifies a custom IMAP command to use instead of LIST. 2069 (Added in 7.30.0) 2070 2071 (SMTP) Specifies a custom SMTP command to use instead of HELP or 2072 VRFY. (Added in 7.34.0) 2073 2074 If this option is used several times, the last one will be used. 2075 2076 --xattr 2077 When saving output to a file, this option tells curl to store 2078 certain file metadata in extended file attributes. Currently, 2079 the URL is stored in the xdg.origin.url attribute and, for HTTP, 2080 the content type is stored in the mime_type attribute. If the 2081 file system does not support extended attributes, a warning is 2082 issued. 2083 2084 -y, --speed-time <time> 2085 If a download is slower than speed-limit bytes per second during 2086 a speed-time period, the download gets aborted. If speed-time is 2087 used, the default speed-limit will be 1 unless set with -Y. 2088 2089 This option controls transfers and thus will not affect slow 2090 connects etc. If this is a concern for you, try the --connect- 2091 timeout option. 2092 2093 If this option is used several times, the last one will be used. 2094 2095 -Y, --speed-limit <speed> 2096 If a download is slower than this given speed (in bytes per sec- 2097 ond) for speed-time seconds it gets aborted. speed-time is set 2098 with -y and is 30 if not set. 2099 2100 If this option is used several times, the last one will be used. 2101 2102 -z, --time-cond <date expression>|<file> 2103 (HTTP/FTP) Request a file that has been modified later than the 2104 given time and date, or one that has been modified before that 2105 time. The <date expression> can be all sorts of date strings or 2106 if it doesn't match any internal ones, it is taken as a filename 2107 and curl tries to get the modification date (mtime) from <file> 2108 instead. See the curl_getdate(3) man pages for date expression 2109 details.
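For example, to fetch a document only if it changed after a given date, or only if it is newer than a local copy, command lines could look like these (the date string, file names and URL are only illustrations):

              curl -z "12 Jan 2012" http://example.com/page.html
              curl -z local.html http://example.com/page.html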
2110 2111 Start the date expression with a dash (-) to make it request for 2112 a document that is older than the given date/time, default is a 2113 document that is newer than the specified date/time. 2114 2115 If this option is used several times, the last one will be used. 2116 2117 -h, --help 2118 Usage help. 2119 2120 -M, --manual 2121 Manual. Display the huge help text. 2122 2123 -V, --version 2124 Displays information about curl and the libcurl version it uses. 2125 The first line includes the full version of curl, libcurl and 2126 other 3rd party libraries linked with the executable. 2127 2128 The second line (starts with "Protocols:") shows all protocols 2129 that libcurl reports to support. 2130 2131 The third line (starts with "Features:") shows specific features 2132 libcurl reports to offer. Available features include: 2133 2134 IPv6 You can use IPv6 with this. 2135 2136 krb4 Krb4 for FTP is supported. 2137 2138 SSL HTTPS and FTPS are supported. 2139 2140 libz Automatic decompression of compressed files over HTTP is 2141 supported. 2142 2143 NTLM NTLM authentication is supported. 2144 2145 GSS-Negotiate 2146 Negotiate authentication and krb5 for FTP is supported. 2147 2148 Debug This curl uses a libcurl built with Debug. This enables 2149 more error-tracking and memory debugging etc. For curl- 2150 developers only! 2151 2152 AsynchDNS 2153 This curl uses asynchronous name resolves. 2154 2155 SPNEGO SPNEGO Negotiate authentication is supported. 2156 2157 Largefile 2158 This curl supports transfers of large files, files larger 2159 than 2GB. 2160 2161 IDN This curl supports IDN - international domain names. 2162 2163 SSPI SSPI is supported. If you use NTLM and set a blank user 2164 name, curl will authenticate with your current user and 2165 password. 2166 2167 TLS-SRP 2168 SRP (Secure Remote Password) authentication is supported 2169 for TLS. 2170 Metalink 2171 This curl supports Metalink (both version 3 and 4 (RFC 2172 5854)), which describes mirrors and hashes. curl will 2173 use mirrors for failover if there are errors (such as the 2174 file or server not being available). 2175 2176FILES 2177 ~/.curlrc 2178 Default config file, see -K, --config for details. 2179 2180ENVIRONMENT 2181 The environment variables can be specified in lower case or upper case. 2182 The lower case version has precedence. http_proxy is an exception as it 2183 is only available in lower case. 2184 2185 Using an environment variable to set the proxy has the same effect as 2186 using the --proxy option. 2187 2188 http_proxy [protocol://]<host>[:port] 2189 Sets the proxy server to use for HTTP. 2190 HTTPS_PROXY [protocol://]<host>[:port] 2191 Sets the proxy server to use for HTTPS. 2192 2193 [url-protocol]_PROXY [protocol://]<host>[:port] 2194 Sets the proxy server to use for [url-protocol], where the pro- 2195 tocol is a protocol that curl supports and as specified in a 2196 URL. FTP, FTPS, POP3, IMAP, SMTP, LDAP etc. 2197 2198 ALL_PROXY [protocol://]<host>[:port] 2199 Sets the proxy server to use if no protocol-specific proxy is 2200 set. 2201 2202 NO_PROXY <comma-separated list of hosts> 2203 list of host names that shouldn't go through any proxy. If set 2204 to a asterisk '*' only, it matches all hosts. 2205 2206PROXY PROTOCOL PREFIXES 2207 Since curl version 7.21.7, the proxy string may be specified with a 2208 protocol:// prefix to specify alternative proxy protocols. 
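For example, a SOCKS4a proxy can be requested purely through the protocol prefix (the proxy name and URL are only placeholders):

       curl -x socks4a://proxy.example.com:1080 http://example.com/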
2209 2210 If no protocol is specified in the proxy string or if the string 2211 doesn't match a supported one, the proxy will be treated as an HTTP 2212 proxy. 2213 2214 The supported proxy protocol prefixes are as follows: 2215 2216 socks4:// 2217 Makes it the equivalent of --socks4 2218 2219 socks4a:// 2220 Makes it the equivalent of --socks4a 2221 2222 socks5:// 2223 Makes it the equivalent of --socks5 2224 2225 socks5h:// 2226 Makes it the equivalent of --socks5-hostname 2227 2228EXIT CODES 2229 There are a bunch of different error codes and their corresponding 2230 error messages that may appear during bad conditions. At the time of 2231 this writing, the exit codes are: 2232 2233 1 Unsupported protocol. This build of curl has no support for this 2234 protocol. 2235 2236 2 Failed to initialize. 2237 2238 3 URL malformed. The syntax was not correct. 2239 2240 4 A feature or option that was needed to perform the desired 2241 request was not enabled or was explicitly disabled at build- 2242 time. To make curl able to do this, you probably need another 2243 build of libcurl! 2244 2245 5 Couldn't resolve proxy. The given proxy host could not be 2246 resolved. 2247 2248 6 Couldn't resolve host. The given remote host was not resolved. 2249 2250 7 Failed to connect to host. 2251 2252 8 FTP weird server reply. The server sent data curl couldn't 2253 parse. 2254 2255 9 FTP access denied. The server denied login or denied access to 2256 the particular resource or directory you wanted to reach. Most 2257 often you tried to change to a directory that doesn't exist on 2258 the server. 2259 2260 11 FTP weird PASS reply. Curl couldn't parse the reply sent to the 2261 PASS request. 2262 2263 13 FTP weird PASV reply, Curl couldn't parse the reply sent to the 2264 PASV request. 2265 2266 14 FTP weird 227 format. Curl couldn't parse the 227-line the 2267 server sent. 2268 2269 15 FTP can't get host. Couldn't resolve the host IP we got in the 2270 227-line. 2271 2272 17 FTP couldn't set binary. Couldn't change transfer method to 2273 binary. 2274 2275 18 Partial file. Only a part of the file was transferred. 2276 2277 19 FTP couldn't download/access the given file, the RETR (or simi- 2278 lar) command failed. 2279 2280 21 FTP quote error. A quote command returned error from the server. 2281 22 HTTP page not retrieved. The requested url was not found or 2282 returned another error with the HTTP error code being 400 or 2283 above. This return code only appears if -f, --fail is used. 2284 2285 23 Write error. Curl couldn't write data to a local filesystem or 2286 similar. 2287 2288 25 FTP couldn't STOR file. The server denied the STOR operation, 2289 used for FTP uploading. 2290 2291 26 Read error. Various reading problems. 2292 2293 27 Out of memory. A memory allocation request failed. 2294 2295 28 Operation timeout. The specified time-out period was reached 2296 according to the conditions. 2297 2298 30 FTP PORT failed. The PORT command failed. Not all FTP servers 2299 support the PORT command, try doing a transfer using PASV 2300 instead! 2301 2302 31 FTP couldn't use REST. The REST command failed. This command is 2303 used for resumed FTP transfers. 2304 2305 33 HTTP range error. The range "command" didn't work. 2306 2307 34 HTTP post error. Internal post-request generation error. 2308 2309 35 SSL connect error. The SSL handshaking failed. 2310 2311 36 FTP bad download resume. Couldn't continue an earlier aborted 2312 download. 2313 2314 37 FILE couldn't read file. Failed to open the file. Permissions? 
2315 2316 38 LDAP cannot bind. LDAP bind operation failed. 2317 2318 39 LDAP search failed. 2319 2320 41 Function not found. A required LDAP function was not found. 2321 2322 42 Aborted by callback. An application told curl to abort the oper- 2323 ation. 2324 2325 43 Internal error. A function was called with a bad parameter. 2326 2327 45 Interface error. A specified outgoing interface could not be 2328 used. 2329 2330 47 Too many redirects. When following redirects, curl hit the maxi- 2331 mum amount. 2332 2333 48 Unknown option specified to libcurl. This indicates that you 2334 passed a weird option to curl that was passed on to libcurl and 2335 rejected. Read up in the manual! 2336 2337 49 Malformed telnet option. 2338 2339 51 The peer's SSL certificate or SSH MD5 fingerprint was not OK. 2340 2341 52 The server didn't reply anything, which here is considered an 2342 error. 2343 2344 53 SSL crypto engine not found. 2345 2346 54 Cannot set SSL crypto engine as default. 2347 2348 55 Failed sending network data. 2349 2350 56 Failure in receiving network data. 2351 2352 58 Problem with the local certificate. 2353 2354 59 Couldn't use specified SSL cipher. 2355 2356 60 Peer certificate cannot be authenticated with known CA certifi- 2357 cates. 2358 2359 61 Unrecognized transfer encoding. 2360 2361 62 Invalid LDAP URL. 2362 2363 63 Maximum file size exceeded. 2364 2365 64 Requested FTP SSL level failed. 2366 2367 65 Sending the data requires a rewind that failed. 2368 2369 66 Failed to initialise SSL Engine. 2370 2371 67 The user name, password, or similar was not accepted and curl 2372 failed to log in. 2373 2374 68 File not found on TFTP server. 2375 2376 69 Permission problem on TFTP server. 2377 2378 70 Out of disk space on TFTP server. 2379 2380 71 Illegal TFTP operation. 2381 2382 72 Unknown TFTP transfer ID. 2383 2384 73 File already exists (TFTP). 2385 2386 74 No such user (TFTP). 2387 2388 75 Character conversion failed. 2389 2390 76 Character conversion functions required. 2391 2392 77 Problem with reading the SSL CA cert (path? access rights?). 2393 2394 78 The resource referenced in the URL does not exist. 2395 2396 79 An unspecified error occurred during the SSH session. 2397 2398 80 Failed to shut down the SSL connection. 2399 2400 82 Could not load CRL file, missing or wrong format (added in 2401 7.19.0). 2402 2403 83 Issuer check failed (added in 7.19.0). 2404 2405 84 The FTP PRET command failed 2406 2407 85 RTSP: mismatch of CSeq numbers 2408 2409 86 RTSP: mismatch of Session Identifiers 2410 2411 87 unable to parse FTP file list 2412 2413 88 FTP chunk callback reported error 2414 2415 89 No connection available, the session will be queued 2416 2417 XX More error codes will appear here in future releases. The exist- 2418 ing ones are meant to never change. 2419 2420AUTHORS / CONTRIBUTORS 2421 Daniel Stenberg is the main author, but the whole list of contributors 2422 is found in the separate THANKS file. 
2423 2424WWW 2425 http://curl.haxx.se 2426 2427FTP 2428 ftp://ftp.sunet.se/pub/www/utilities/curl/ 2429 2430SEE ALSO 2431 ftp(1), wget(1) 2432 2433LATEST VERSION 2434 2435 You always find news about what's going on as well as the latest versions 2436 from the curl web pages, located at: 2437 2438 http://curl.haxx.se 2439 2440SIMPLE USAGE 2441 2442 Get the main page from Netscape's web-server: 2443 2444 curl http://www.netscape.com/ 2445 2446 Get the README file the user's home directory at funet's ftp-server: 2447 2448 curl ftp://ftp.funet.fi/README 2449 2450 Get a web page from a server using port 8000: 2451 2452 curl http://www.weirdserver.com:8000/ 2453 2454 Get a directory listing of an FTP site: 2455 2456 curl ftp://cool.haxx.se/ 2457 2458 Get the definition of curl from a dictionary: 2459 2460 curl dict://dict.org/m:curl 2461 2462 Fetch two documents at once: 2463 2464 curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/ 2465 2466 Get a file off an FTPS server: 2467 2468 curl ftps://files.are.secure.com/secrets.txt 2469 2470 or use the more appropriate FTPS way to get the same file: 2471 2472 curl --ftp-ssl ftp://files.are.secure.com/secrets.txt 2473 2474 Get a file from an SSH server using SFTP: 2475 2476 curl -u username sftp://shell.example.com/etc/issue 2477 2478 Get a file from an SSH server using SCP using a private key to authenticate: 2479 2480 curl -u username: --key ~/.ssh/id_dsa --pubkey ~/.ssh/id_dsa.pub \ 2481 scp://shell.example.com/~/personal.txt 2482 2483 Get the main page from an IPv6 web server: 2484 2485 curl -g "http://[2001:1890:1112:1::20]/" 2486 2487DOWNLOAD TO A FILE 2488 2489 Get a web page and store in a local file with a specific name: 2490 2491 curl -o thatpage.html http://www.netscape.com/ 2492 2493 Get a web page and store in a local file, make the local file get the name 2494 of the remote document (if no file name part is specified in the URL, this 2495 will fail): 2496 2497 curl -O http://www.netscape.com/index.html 2498 2499 Fetch two files and store them with their remote names: 2500 2501 curl -O www.haxx.se/index.html -O curl.haxx.se/download.html 2502 2503USING PASSWORDS 2504 2505 FTP 2506 2507 To ftp files using name+passwd, include them in the URL like: 2508 2509 curl ftp://name:passwd@machine.domain:port/full/path/to/file 2510 2511 or specify them with the -u flag like 2512 2513 curl -u name:passwd ftp://machine.domain:port/full/path/to/file 2514 2515 FTPS 2516 2517 It is just like for FTP, but you may also want to specify and use 2518 SSL-specific options for certificates etc. 2519 2520 Note that using FTPS:// as prefix is the "implicit" way as described in the 2521 standards while the recommended "explicit" way is done by using FTP:// and 2522 the --ftp-ssl option. 2523 2524 SFTP / SCP 2525 2526 This is similar to FTP, but you can specify a private key to use instead of 2527 a password. Note that the private key may itself be protected by a password 2528 that is unrelated to the login password of the remote system. If you 2529 provide a private key file you must also provide a public key file. 2530 2531 HTTP 2532 2533 Curl also supports user and password in HTTP URLs, thus you can pick a file 2534 like: 2535 2536 curl http://name:passwd@machine.domain/full/path/to/file 2537 2538 or specify user and password separately like in 2539 2540 curl -u name:passwd http://machine.domain/full/path/to/file 2541 2542 HTTP offers many different methods of authentication and curl supports 2543 several: Basic, Digest, NTLM and Negotiate. 
Without telling which method to 2544 use, curl defaults to Basic. You can also ask curl to pick the most secure 2545 ones out of the ones that the server accepts for the given URL, by using 2546 --anyauth. 2547 2548 NOTE! According to the URL specification, HTTP URLs can not contain a user 2549 and password, so that style will not work when using curl via a proxy, even 2550 though curl allows it at other times. When using a proxy, you _must_ use 2551 the -u style for user and password. 2552 2553 HTTPS 2554 2555 Probably most commonly used with private certificates, as explained below. 2556 2557PROXY 2558 2559 curl supports both HTTP and SOCKS proxy servers, with optional authentication. 2560 It does not have special support for FTP proxy servers since there are no 2561 standards for those, but it can still be made to work with many of them. You 2562 can also use both HTTP and SOCKS proxies to transfer files to and from FTP 2563 servers. 2564 2565 Get an ftp file using an HTTP proxy named my-proxy that uses port 888: 2566 2567 curl -x my-proxy:888 ftp://ftp.leachsite.com/README 2568 2569 Get a file from an HTTP server that requires user and password, using the 2570 same proxy as above: 2571 2572 curl -u user:passwd -x my-proxy:888 http://www.get.this/ 2573 2574 Some proxies require special authentication. Specify by using -U as above: 2575 2576 curl -U user:passwd -x my-proxy:888 http://www.get.this/ 2577 2578 A comma-separated list of hosts and domains which do not use the proxy can 2579 be specified as: 2580 2581 curl --noproxy localhost,get.this -x my-proxy:888 http://www.get.this/ 2582 2583 If the proxy is specified with --proxy1.0 instead of --proxy or -x, then 2584 curl will use HTTP/1.0 instead of HTTP/1.1 for any CONNECT attempts. 2585 2586 curl also supports SOCKS4 and SOCKS5 proxies with --socks4 and --socks5. 2587 2588 See also the environment variables Curl supports that offer further proxy 2589 control. 2590 2591 Most FTP proxy servers are set up to appear as a normal FTP server from the 2592 client's perspective, with special commands to select the remote FTP server. 2593 curl supports the -u, -Q and --ftp-account options that can be used to 2594 set up transfers through many FTP proxies. For example, a file can be 2595 uploaded to a remote FTP server using a Blue Coat FTP proxy with the 2596 options: 2597 2598 curl -u "Remote-FTP-Username@remote.ftp.server Proxy-Username:Remote-Pass" \ 2599 --ftp-account Proxy-Password --upload-file local-file \ 2600 ftp://my-ftp.proxy.server:21/remote/upload/path/ 2601 2602 See the manual for your FTP proxy to determine the form it expects to set up 2603 transfers, and curl's -v option to see exactly what curl is sending. 2604 2605RANGES 2606 2607 HTTP 1.1 introduced byte-ranges. Using this, a client can request 2608 to get only one or more subparts of a specified document. Curl supports 2609 this with the -r flag. 2610 2611 Get the first 100 bytes of a document: 2612 2613 curl -r 0-99 http://www.get.this/ 2614 2615 Get the last 500 bytes of a document: 2616 2617 curl -r -500 http://www.get.this/ 2618 2619 Curl also supports simple ranges for FTP files as well. Then you can only 2620 specify start and stop position. 
2621 2622 Get the first 100 bytes of a document using FTP: 2623 2624 curl -r 0-99 ftp://www.get.this/README 2625 2626UPLOADING 2627 2628 FTP / FTPS / SFTP / SCP 2629 2630 Upload all data on stdin to a specified server: 2631 2632 curl -T - ftp://ftp.upload.com/myfile 2633 2634 Upload data from a specified file, login with user and password: 2635 2636 curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile 2637 2638 Upload a local file to the remote site, and use the local file name at the remote 2639 site too: 2640 2641 curl -T uploadfile -u user:passwd ftp://ftp.upload.com/ 2642 2643 Upload a local file to get appended to the remote file: 2644 2645 curl -T localfile -a ftp://ftp.upload.com/remotefile 2646 2647 Curl also supports ftp upload through a proxy, but only if the proxy is 2648 configured to allow that kind of tunneling. If it does, you can run curl in 2649 a fashion similar to: 2650 2651 curl --proxytunnel -x proxy:port -T localfile ftp.upload.com 2652 2653 HTTP 2654 2655 Upload all data on stdin to a specified HTTP site: 2656 2657 curl -T - http://www.upload.com/myfile 2658 2659 Note that the HTTP server must have been configured to accept PUT before 2660 this can be done successfully. 2661 2662 For other ways to do HTTP data upload, see the POST section below. 2663 2664VERBOSE / DEBUG 2665 2666 If curl fails where it isn't supposed to, if the servers don't let you in, 2667 if you can't understand the responses: use the -v flag to get verbose 2668 fetching. Curl will output lots of info and what it sends and receives in 2669 order to let the user see all client-server interaction (but it won't show 2670 you the actual data). 2671 2672 curl -v ftp://ftp.upload.com/ 2673 2674 To get even more details and information on what curl does, try using the 2675 --trace or --trace-ascii options with a given file name to log to, like 2676 this: 2677 2678 curl --trace trace.txt www.haxx.se 2679 2680 2681DETAILED INFORMATION 2682 2683 Different protocols provide different ways of getting detailed information 2684 about specific files/documents. To get curl to show detailed information 2685 about a single file, you should use -I/--head option. It displays all 2686 available info on a single file for HTTP and FTP. The HTTP information is a 2687 lot more extensive. 2688 2689 For HTTP, you can get the header information (the same as -I would show) 2690 shown before the data by using -i/--include. Curl understands the 2691 -D/--dump-header option when getting files from both FTP and HTTP, and it 2692 will then store the headers in the specified file. 2693 2694 Store the HTTP headers in a separate file (headers.txt in the example): 2695 2696 curl --dump-header headers.txt curl.haxx.se 2697 2698 Note that headers stored in a separate file can be very useful at a later 2699 time if you want curl to use cookies sent by the server. More about that in 2700 the cookies section. 2701 2702POST (HTTP) 2703 2704 It's easy to post data using curl. This is done using the -d <data> 2705 option. The post data must be urlencoded. 2706 2707 Post a simple "name" and "phone" guestbook. 2708 2709 curl -d "name=Rafael%20Sagula&phone=3320780" \ 2710 http://www.where.com/guest.cgi 2711 2712 How to post a form with curl, lesson #1: 2713 2714 Dig out all the <input> tags in the form that you want to fill in. (There's 2715 a perl program called formfind.pl on the curl site that helps with this). 2716 2717 If there's a "normal" post, you use -d to post. 
-d takes a full "post 2718 string", which is in the format 2719 2720 <variable1>=<data1>&<variable2>=<data2>&... 2721 2722 The 'variable' names are the names set with "name=" in the <input> tags, and 2723 the data is the contents you want to fill in for the inputs. The data *must* 2724 be properly URL encoded. That means you replace space with + and that you 2725 replace weird letters with %XX where XX is the hexadecimal representation of 2726 the letter's ASCII code. 2727 2728 Example: 2729 2730 (page located at http://www.formpost.com/getthis/ 2731 2732 <form action="post.cgi" method="post"> 2733 <input name=user size=10> 2734 <input name=pass type=password size=10> 2735 <input name=id type=hidden value="blablabla"> 2736 <input name=ding value="submit"> 2737 </form> 2738 2739 We want to enter user 'foobar' with password '12345'. 2740 2741 To post to this, you enter a curl command line like: 2742 2743 curl -d "user=foobar&pass=12345&id=blablabla&ding=submit" (continues) 2744 http://www.formpost.com/getthis/post.cgi 2745 2746 2747 While -d uses the application/x-www-form-urlencoded mime-type, generally 2748 understood by CGI's and similar, curl also supports the more capable 2749 multipart/form-data type. This latter type supports things like file upload. 2750 2751 -F accepts parameters like -F "name=contents". If you want the contents to 2752 be read from a file, use <@filename> as contents. When specifying a file, 2753 you can also specify the file content type by appending ';type=<mime type>' 2754 to the file name. You can also post the contents of several files in one 2755 field. For example, the field name 'coolfiles' is used to send three files, 2756 with different content types using the following syntax: 2757 2758 curl -F "coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html" \ 2759 http://www.post.com/postit.cgi 2760 2761 If the content-type is not specified, curl will try to guess from the file 2762 extension (it only knows a few), or use the previously specified type (from 2763 an earlier file if several files are specified in a list) or else it will 2764 use the default type 'application/octet-stream'. 2765 2766 Emulate a fill-in form with -F. Let's say you fill in three fields in a 2767 form. One field is a file name which to post, one field is your name and one 2768 field is a file description. We want to post the file we have written named 2769 "cooltext.txt". To let curl do the posting of this data instead of your 2770 favourite browser, you have to read the HTML source of the form page and 2771 find the names of the input fields. In our example, the input field names 2772 are 'file', 'yourname' and 'filedescription'. 2773 2774 curl -F "file=@cooltext.txt" -F "yourname=Daniel" \ 2775 -F "filedescription=Cool text file with cool text inside" \ 2776 http://www.post.com/postit.cgi 2777 2778 To send two files in one post you can do it in two ways: 2779 2780 1. Send multiple files in a single "field" with a single field name: 2781 2782 curl -F "pictures=@dog.gif,cat.gif" 2783 2784 2. Send two fields with two field names: 2785 2786 curl -F "docpicture=@dog.gif" -F "catpicture=@cat.gif" 2787 2788 To send a field value literally without interpreting a leading '@' 2789 or '<', or an embedded ';type=', use --form-string instead of 2790 -F. This is recommended when the value is obtained from a user or 2791 some other unpredictable source. Under these circumstances, using 2792 -F instead of --form-string would allow a user to trick curl into 2793 uploading a file. 
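For example, assuming a form field named 'comment', the first command line below sends the five characters "@file" literally, while the second would instead make curl try to upload a local file named "file":

   curl --form-string "comment=@file" http://www.post.com/postit.cgi
   curl -F "comment=@file" http://www.post.com/postit.cgi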
2794 2795REFERRER 2796 2797 An HTTP request has the option to include information about which address 2798 referred it to the actual page. Curl allows you to specify the 2799 referrer to be used on the command line. It is especially useful to 2800 fool or trick stupid servers or CGI scripts that rely on that information 2801 being available or contain certain data. 2802 2803 curl -e www.coolsite.com http://www.showme.com/ 2804 2805 NOTE: The Referer: [sic] field is defined in the HTTP spec to be a full URL. 2806 2807USER AGENT 2808 2809 An HTTP request has the option to include information about the browser 2810 that generated the request. Curl allows it to be specified on the command 2811 line. It is especially useful to fool or trick stupid servers or CGI 2812 scripts that only accept certain browsers. 2813 2814 Example: 2815 2816 curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/ 2817 2818 Other common strings: 2819 'Mozilla/3.0 (Win95; I)' Netscape Version 3 for Windows 95 2820 'Mozilla/3.04 (Win95; U)' Netscape Version 3 for Windows 95 2821 'Mozilla/2.02 (OS/2; U)' Netscape Version 2 for OS/2 2822 'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)' NS for AIX 2823 'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)' NS for Linux 2824 2825 Note that Internet Explorer tries hard to be compatible in every way: 2826 'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)' MSIE for W95 2827 2828 Mozilla is not the only possible User-Agent name: 2829 'Konqueror/1.0' KDE File Manager desktop client 2830 'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser 2831 2832COOKIES 2833 2834 Cookies are generally used by web servers to keep state information at the 2835 client's side. The server sets cookies by sending a response line in the 2836 headers that looks like 'Set-Cookie: <data>' where the data part then 2837 typically contains a set of NAME=VALUE pairs (separated by semicolons ';' 2838 like "NAME1=VALUE1; NAME2=VALUE2;"). The server can also specify for what 2839 path the "cookie" should be used for (by specifying "path=value"), when the 2840 cookie should expire ("expire=DATE"), for what domain to use it 2841 ("domain=NAME") and if it should be used on secure connections only 2842 ("secure"). 2843 2844 If you've received a page from a server that contains a header like: 2845 Set-Cookie: sessionid=boo123; path="/foo"; 2846 2847 it means the server wants that first pair passed on when we get anything in 2848 a path beginning with "/foo". 2849 2850 Example, get a page that wants my name passed in a cookie: 2851 2852 curl -b "name=Daniel" www.sillypage.com 2853 2854 Curl also has the ability to use previously received cookies in following 2855 sessions. If you get cookies from a server and store them in a file in a 2856 manner similar to: 2857 2858 curl --dump-header headers www.example.com 2859 2860 ... you can then in a second connect to that (or another) site, use the 2861 cookies from the 'headers' file like: 2862 2863 curl -b headers www.example.com 2864 2865 While saving headers to a file is a working way to store cookies, it is 2866 however error-prone and not the preferred way to do this. Instead, make curl 2867 save the incoming cookies using the well-known netscape cookie format like 2868 this: 2869 2870 curl -c cookies.txt www.example.com 2871 2872 Note that by specifying -b you enable the "cookie awareness" and with -L 2873 you can make curl follow a location: (which often is used in combination 2874 with cookies). 

PROGRESS METER

  The progress meter exists to show a user that something actually is
  happening. The different fields in the output have the following meaning:

  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
  0  151M    0   38608    0     0   9406      0  4:41:43  0:00:04  4:41:39  9287

  From left to right:

   %             - percentage completed of the whole transfer
   Total         - total size of the whole expected transfer
   %             - percentage completed of the download
   Received      - currently downloaded amount of bytes
   %             - percentage completed of the upload
   Xferd         - currently uploaded amount of bytes
   Average Speed
     Dload       - the average transfer speed of the download
   Average Speed
     Upload      - the average transfer speed of the upload
   Time Total    - expected time to complete the operation
   Time Current  - time passed since the invocation
   Time Left     - expected time left to completion
   Curr.Speed    - the average transfer speed over the last 5 seconds (the
                   first 5 seconds of a transfer are of course based on less
                   time)

  The -# option will display a totally different progress bar that doesn't
  need much explanation!

SPEED LIMIT

  Curl allows the user to set the transfer speed conditions that must be met
  to let the transfer keep going. By using the switches -y and -Y you can
  make curl abort transfers if the transfer speed is below the specified
  lowest limit for a specified time.

  To have curl abort the download if the speed is slower than 3000 bytes per
  second for 1 minute, run:

        curl -Y 3000 -y 60 www.far-away-site.com

  This can very well be used in combination with the overall time limit, so
  that the above operation must also be completed in whole within 30 minutes:

        curl -m 1800 -Y 3000 -y 60 www.far-away-site.com

  Forcing curl not to transfer data faster than a given rate is also
  possible, which might be useful if you're using a limited bandwidth
  connection and you don't want your transfer to use all of it (this is
  sometimes referred to as "bandwidth throttling").

  Make curl transfer data no faster than 10 kilobytes per second:

        curl --limit-rate 10K www.far-away-site.com

  or

        curl --limit-rate 10240 www.far-away-site.com

  Or prevent curl from uploading data faster than 1 megabyte per second:

        curl -T upload --limit-rate 1M ftp://uploadshereplease.com

  When using the --limit-rate option, the transfer rate is regulated on a
  per-second basis, which will cause the total transfer speed to become lower
  than the given number, sometimes substantially lower if your transfer
  stalls during periods.
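
  The two kinds of limits can also be combined; as a sketch (the host name is
  only an example), keep the download below 100K per second but still give up
  if it drops under 1000 bytes per second for 30 seconds:

        curl --limit-rate 100K -Y 1000 -y 30 -o file www.far-away-site.com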

CONFIG FILE

  Curl automatically tries to read the .curlrc file (or _curlrc file on win32
  systems) from the user's home dir on startup.

  The config file could be made up with normal command line switches, but you
  can also specify the long options without the dashes to make it more
  readable. You can separate the options and the parameter with spaces, or
  with = or :. Comments can be used within the file. If the first letter on a
  line is a '#'-symbol the rest of the line is treated as a comment.

  If you want the parameter to contain spaces, you must enclose the entire
  parameter within double quotes ("). Within those quotes, you specify a
  quote as \".

  NOTE: You must specify options and their arguments on the same line.

  Example, set default time out and proxy in a config file:

        # We want a 30 minute timeout:
        -m 1800
        # ... and we use a proxy for all accesses:
        proxy = proxy.our.domain.com:8080

  White spaces ARE significant at the end of lines, but all white spaces
  leading up to the first characters of each line are ignored.

  Prevent curl from reading the default file by using -q as the first command
  line parameter, like:

        curl -q www.thatsite.com

  Force curl to get and display a local help page in case it is invoked
  without a URL by making a config file similar to:

        # default url to get
        url = "http://help.with.curl.com/curlhelp.html"

  You can specify another config file to be read by using the -K/--config
  flag. If you set the config file name to "-" it'll read the config from
  stdin, which can be handy if you want to hide options from being visible in
  process tables etc:

        echo "user = user:passwd" | curl -K - http://that.secret.site.com

EXTRA HEADERS

  When using curl in your own programs, you may end up needing to pass on
  your own custom headers when getting a web page. You can do this by using
  the -H flag.

  Example, send the header "X-you-and-me: yes" to the server when getting a
  page:

        curl -H "X-you-and-me: yes" www.love.com

  This can also be useful in case you want curl to send a different text in a
  header than it normally does. The -H header you specify then replaces the
  header curl would normally send. If you replace an internal header with an
  empty one, you prevent that header from being sent. To prevent the Host:
  header from being used:

        curl -H "Host:" www.server.com

FTP and PATH NAMES

  Do note that when getting files with an ftp:// URL, the given path is
  relative to the directory you enter. To get the file 'README' from your
  home directory at your ftp site, do:

        curl ftp://user:passwd@my.site.com/README

  But if you want the README file from the root directory of that very same
  site, you need to specify the absolute file name:

        curl ftp://user:passwd@my.site.com//README

  (I.e. with an extra slash in front of the file name.)

SFTP and SCP and PATH NAMES

  With sftp: and scp: URLs, the path name given is the absolute name on the
  server. To access a file relative to the remote user's home directory,
  prefix the file with /~/, such as:

        curl -u $USER sftp://home.example.com/~/.bashrc
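
  The same prefix can be used in the other direction too; as a sketch (the
  user, host and file names are only examples), uploading a file into the
  remote user's home directory over SFTP could look like:

        curl -u user -T localfile sftp://home.example.com/~/remotefile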

FTP and firewalls

  The FTP protocol requires one of the involved parties to open a second
  connection as soon as data is about to get transferred. There are two ways
  to do this.

  The default way for curl is to issue the PASV command, which causes the
  server to open another port and await another connection performed by the
  client. This is good if the client is behind a firewall that doesn't allow
  incoming connections.

        curl ftp.download.com

  If the server, for example, is behind a firewall that doesn't allow
  connections on ports other than 21 (or if it just doesn't support the PASV
  command), the other way to do it is to use the PORT command and instruct
  the server to connect to the client on the given IP number and port (as
  parameters to the PORT command).

  The -P flag to curl supports a few different options. Your machine may have
  several IP-addresses and/or network interfaces and curl allows you to
  select which of them to use. The default address can also be used:

        curl -P - ftp.download.com

  Download with PORT but use the IP address of our 'le0' interface (this does
  not work on Windows):

        curl -P le0 ftp.download.com

  Download with PORT but use 192.168.0.10 as our IP address:

        curl -P 192.168.0.10 ftp.download.com

NETWORK INTERFACE

  Get a web page from a server using a specified network interface (or
  address):

        curl --interface eth0:1 http://www.netscape.com/

  or

        curl --interface 192.168.1.10 http://www.netscape.com/

HTTPS

  Secure HTTP requires SSL libraries to be installed and used when curl is
  built. If that is done, curl is capable of retrieving and posting documents
  using the HTTPS protocol.

  Example:

        curl https://www.secure-site.com

  Curl is also capable of using your personal certificates to get/post files
  from sites that require valid certificates. The only drawback is that the
  certificate needs to be in PEM format. PEM is a standard and open format to
  store certificates with, but it is not used by the most commonly used
  browsers (Netscape and MSIE both use the so-called PKCS#12 format). If you
  want curl to use the certificates you use with your (favourite) browser,
  you may need to download/compile a converter that can convert your
  browser's formatted certificates to PEM formatted ones. This kind of
  converter is included in recent versions of OpenSSL, and for older versions
  Dr Stephen N. Henson has written a patch for SSLeay that adds this
  functionality. You can get his patch (that requires an SSLeay installation)
  from his site at: http://www.drh-consultancy.demon.co.uk/

  Example of how to automatically retrieve a document using a certificate
  with a personal password:

        curl -E /path/to/cert.pem:password https://secure.site.com/

  If you neglect to specify the password on the command line, you will be
  prompted for the correct password before any data can be received.
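
  If the certificate and the private key are kept in separate PEM files (the
  paths below are only examples), the key can be pointed out with the --key
  option:

        curl -E /path/to/cert.pem --key /path/to/key.pem https://secure.site.com/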

  Many older SSL servers have problems with SSLv3 or TLS, which newer
  versions of OpenSSL etc use; therefore it is sometimes useful to specify
  what SSL version curl should use. Use -3, -2 or -1 to specify the exact SSL
  version to use (for SSLv3, SSLv2 or TLSv1 respectively):

        curl -2 https://secure.site.com/

  Otherwise, curl will first attempt to use v3 and then v2.

  To use OpenSSL to convert your favourite browser's certificate into a PEM
  formatted one that curl can use, do something like this:

  In Netscape, you start with hitting the 'Security' menu button.

        Select 'certificates->yours' and then pick a certificate in the list

        Press the 'Export' button

        Enter your PIN code for the certs

        Select a proper place to save it

  Run the 'openssl' application to convert the certificate. If you cd to the
  openssl installation, you can do it like:

        # ./apps/openssl pkcs12 -in [file you saved] -clcerts -out [PEMfile]

  In Firefox, select Options, then Advanced, then the Encryption tab, View
  Certificates. This opens the Certificate Manager, where you can Export. Be
  sure to select PEM for the Save as type.

  In Internet Explorer, select Internet Options, then the Content tab, then
  Certificates. Then you can Export, and depending on the format you may need
  to convert to PEM.

  In Chrome, select Settings, then Show Advanced Settings. Under HTTPS/SSL
  select Manage Certificates.

RESUMING FILE TRANSFERS

  To continue a file transfer where it was previously aborted, curl supports
  resume on HTTP(S) downloads as well as FTP uploads and downloads.

  Continue downloading a document:

        curl -C - -o file ftp://ftp.server.com/path/file

  Continue uploading a document(*1):

        curl -C - -T file ftp://ftp.server.com/path/file

  Continue downloading a document from a web server(*2):

        curl -C - -o file http://www.server.com/

  (*1) = This requires that the FTP server supports the non-standard command
         SIZE. If it doesn't, curl will say so.

  (*2) = This requires that the web server supports at least HTTP/1.1. If it
         doesn't, curl will say so.

TIME CONDITIONS

  HTTP allows a client to specify a time condition for the document it
  requests. It is If-Modified-Since or If-Unmodified-Since. Curl allows you
  to specify them with the -z/--time-cond flag.

  For example, you can easily make a download that only gets performed if the
  remote file is newer than a local copy. It would be made like:

        curl -z local.html http://remote.server.com/remote.html

  Or you can download a file only if the local file is newer than the remote
  one. Do this by prepending the date string with a '-', as in:

        curl -z -local.html http://remote.server.com/remote.html

  You can specify a "free text" date as condition. Tell curl to only download
  the file if it was updated since January 12, 2012:

        curl -z "Jan 12 2012" http://remote.server.com/remote.html

  Curl will then accept a wide range of date formats. You can always make the
  date check work the other way around by prepending the date with a dash
  '-'.
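
  Combined with -o and -R (which makes curl give the local file the same
  timestamp as the remote one), this can serve as a simple way to keep a
  local copy up to date; the file and URL names here are only examples:

        curl -z remote.html -o remote.html -R http://remote.server.com/remote.html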

DICT

  For fun try

        curl dict://dict.org/m:curl
        curl dict://dict.org/d:heisenbug:jargon
        curl dict://dict.org/d:daniel:web1913

  Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'
  and 'lookup'. For example,

        curl dict://dict.org/find:curl

  Commands that break the URL description of the RFC (but not the DICT
  protocol) are

        curl dict://dict.org/show:db
        curl dict://dict.org/show:strat

  Authentication is still missing (but this is not required by the RFC).

LDAP

  If you have installed the OpenLDAP library, curl can take advantage of it
  and offer ldap:// support.

  LDAP is a complex thing and writing an LDAP query is not an easy task. I do
  advise you to dig up the syntax description for that elsewhere. Two places
  that might suit you are:

        Netscape's "Netscape Directory SDK 3.0 for C Programmer's Guide
        Chapter 10: Working with LDAP URLs":
        http://developer.netscape.com/docs/manuals/dirsdk/csdk30/url.htm

        RFC 2255, "The LDAP URL Format":
        http://curl.haxx.se/rfc/rfc2255.txt

  To show you an example, this is how I can get all people from my local LDAP
  server that have a certain sub-domain in their email address:

        curl -B "ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se"

  If I want the same info in HTML format, I can get it by not using the -B
  (enforce ASCII) flag.

ENVIRONMENT VARIABLES

  Curl reads and understands the following environment variables:

        http_proxy, HTTPS_PROXY, FTP_PROXY

  They should be set for protocol-specific proxies. A general proxy should be
  set with

        ALL_PROXY

  A comma-separated list of host names that shouldn't go through any proxy is
  set in (only an asterisk, '*' matches all hosts)

        NO_PROXY

  If the host name matches one of these strings, or the host is within the
  domain of one of these strings, transactions with that node will not be
  proxied.

  The usage of the -x/--proxy flag overrides the environment variables.

NETRC

  Unix introduced the .netrc concept a long time ago. It is a way for a user
  to specify name and password for commonly visited FTP sites in a file so
  that you don't have to type them in each time you visit those sites. You
  realize this is a big security risk if someone else gets hold of your
  passwords, so most unix programs won't read this file unless it is readable
  only by yourself (curl doesn't care, though).

  Curl supports .netrc files if told to (using the -n/--netrc and
  --netrc-optional options). This is not restricted to just FTP, so curl can
  use it for all protocols where authentication is used.

  A very simple .netrc file could look something like:

        machine curl.haxx.se login iamdaniel password mysecret

CUSTOM OUTPUT

  To better allow script programmers to get to know details about a completed
  transfer, the -w/--write-out option was introduced. Using this, you can
  specify what information from the previous transfer you want to extract.

  To display the number of bytes downloaded together with some text and an
  ending newline:

        curl -w 'We downloaded %{size_download} bytes\n' www.download.com
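
  The option understands a number of other variables as well. As a sketch
  (the host name and the particular variables picked are just examples), this
  prints the HTTP response code and the total time spent, while hiding the
  response body and the progress meter:

        curl -s -o /dev/null -w '%{http_code} %{time_total}\n' http://www.example.com/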

KERBEROS FTP TRANSFER

  Curl supports kerberos4 and kerberos5/GSSAPI for FTP transfers. You need
  the kerberos package installed and used at curl build time for it to be
  available.

  First, get the krb-ticket the normal way, like with the kinit/kauth tool.
  Then use curl in a way similar to:

        curl --krb private ftp://krb4site.com -u username:fakepwd

  The password given with the -u switch is not actually used, but leaving it
  blank will make curl prompt for one; you already entered the real password
  to kinit/kauth, so any fake value will do.

TELNET

  The curl telnet support is basic and very easy to use. Curl passes all data
  passed to it on stdin to the remote server. Connect to a remote telnet
  server using a command line similar to:

        curl telnet://remote.server.com

  And enter the data to pass to the server on stdin. The result will be sent
  to stdout or to the file you specify with -o.

  You might want the -N/--no-buffer option to switch off the buffered output
  for slow connections or similar.

  Pass options to the telnet protocol negotiation by using the -t option. To
  tell the server we use a vt100 terminal, try something like:

        curl -tTTYPE=vt100 telnet://remote.server.com

  Other interesting options for -t include:

   - XDISPLOC=<X display>   Sets the X display location.

   - NEW_ENV=<var,val>      Sets an environment variable.

  NOTE: The telnet protocol does not specify any way to log in with a
  specified user and password, so curl can't do that automatically. To do
  that, you need to track when the login prompt is received and send the
  username and password accordingly.

PERSISTENT CONNECTIONS

  Specifying multiple files on a single command line will make curl transfer
  all of them, one after the other in the specified order.

  libcurl will attempt to use persistent connections for the transfers so
  that the second transfer to the same host can use the same connection that
  was already initiated and was left open in the previous transfer. This
  greatly decreases connection time for all but the first transfer and it
  makes a far better use of the network.

  Note that curl cannot use persistent connections for transfers that are
  used in subsequent curl invocations. Try to stuff as many URLs as possible
  on the same command line if they are using the same host, as that'll make
  the transfers faster. If you use an HTTP proxy for file transfers,
  practically all transfers will be persistent.

MULTIPLE TRANSFERS WITH A SINGLE COMMAND LINE

  As is mentioned above, you can download multiple files with one command
  line by simply adding more URLs. If you want those to get saved to a local
  file instead of just printed to stdout, you need to add one save option for
  each URL you specify. Note that this also goes for the -O option (but not
  --remote-name-all).

  For example: get two files and use -O for the first and a custom file name
  for the second:

        curl -O http://url.com/file.txt ftp://ftp.com/moo.exe -o moo.jpg

  You can also upload multiple files in a similar fashion:

        curl -T local1 ftp://ftp.com/moo.exe -T local2 ftp://ftp.com/moo2.txt
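
  The URL range syntax described earlier combines nicely with -O as well; as
  a sketch (the URL is only an example), this fetches a numbered series of
  files and saves each one under its remote name:

        curl -O "http://url.com/file[1-5].txt"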

IPv6

  curl will connect to a server with IPv6 when a host lookup returns an IPv6
  address and fall back to IPv4 if the connection fails. The --ipv4 and
  --ipv6 options can specify which address to use when both are available.
  IPv6 addresses can also be specified directly in URLs using the syntax:

        http://[2001:1890:1112:1::20]/overview.html

  When this style is used, the -g option must be given to stop curl from
  interpreting the square brackets as special globbing characters. Link local
  and site local addresses including a scope identifier, such as fe80::1234%1,
  may also be used, but the scope portion must be numeric and the percent
  character must be URL escaped. The previous example in an SFTP URL might
  look like:

        sftp://[fe80::1234%251]/

  IPv6 addresses provided other than in URLs (e.g. to the --proxy,
  --interface or --ftp-port options) should not be URL encoded.

METALINK

  Curl supports Metalink (both version 3 and 4 (RFC 5854) are supported), a
  way to list multiple URIs and hashes for a file. Curl will make use of the
  mirrors listed within for failover if there are errors (such as the file or
  server not being available). It will also verify the hash of the file after
  the download completes. The Metalink file itself is downloaded and
  processed in memory and not stored in the local file system.

  Example to use a remote Metalink file:

        curl --metalink http://www.example.com/example.metalink

  To use a Metalink file in the local file system, use the FILE protocol
  (file://):

        curl --metalink file://example.metalink

  Please note that if the FILE protocol is disabled, there is no way to use a
  local Metalink file at the time of this writing. Also note that if
  --metalink and --include are used together, --include will be ignored. This
  is because including headers in the response would break the Metalink
  parser, and if the headers were included in a file described in the
  Metalink file, the hash check would fail.

MAILING LISTS

  For your convenience, we have several open mailing lists to discuss curl,
  its development and things relevant to this. Get all the info at
  http://curl.haxx.se/mail/. Some of the lists available are:

  curl-users

        Users of the command line tool. How to use it, what doesn't work, new
        features, related tools, questions, news, installations,
        compilations, running, porting etc.

  curl-library

        Developers using or developing libcurl. Bugs, extensions,
        improvements.

  curl-announce

        Low-traffic. Only receives announcements of new public versions. At
        worst, that makes something like one or two mails per month, but
        usually only one mail every second month.

  curl-and-php

        Using the curl functions in PHP. Everything curl with a PHP angle. Or
        PHP with a curl angle.

  curl-and-python

        Python hackers using curl with or without the python binding pycurl.

  Please direct curl questions, feature requests and trouble reports to one
  of these mailing lists instead of mailing any individual.