Closed Bug 859059 Opened 11 years ago Closed 10 years ago

Implement "Copy as curl"

Categories

(DevTools :: Netmonitor, defect, P2)

defect

Tracking

(Not tracked)

RESOLVED FIXED
Firefox 31

People

(Reporter: vporof, Assigned: thomas)

References

Details

Attachments

(1 file, 20 obsolete files)

39.15 KB, patch
vporof: review+
Details | Diff | Splinter Review
      No description provided.
Moving into Developer Tools: Netmonitor component. Filter on NETMONITORAMA.
Component: Developer Tools → Developer Tools: Netmonitor
Summary: [netmonitor] Implement "Copy as curl" → Implement "Copy as curl"
Priority: -- → P2
This should be a context menu item on the requests menu, under the "Copy URL" item, and should maybe look like "Copy as cURL" to go along with its official capitalization: http://en.wikipedia.org/wiki/CURL. See Chrome's functionality too.
Please assign this to me. As I see it, the command should be copied to the clipboard with all headers added via the -H flag.
It's yours!
Assignee: nobody → avinash
... what is this bug supposed to do? Seems obvious to everyone but me, however the bug summary mentions nothing. Just copy the URL + a curl command prefix?
(In reply to Nick Fitzgerald [:fitzgen] from comment #5)
> ... what is this bug supposed to do? Seems obvious to everyone but me,
> however the bug summary mentions nothing. Just copy the URL + a curl command
> prefix?

Also headers.
Also add " --compressed" in case of gzipping. I think Chrome adds it every time, but maybe we can case it? Someone would know more about that.
Attached patch 859059.patch (obsolete) — Splinter Review
I couldn't figure out when to add the --data-binary option to the cURL request, so I have attached the rest of the patch; once I find that out I'll add that part too.
Attachment #786198 - Flags: review?(fayearthur)
Comment on attachment 786198 [details] [diff] [review]
859059.patch

Changing review to Victor, I feel like I would miss some things here.
Attachment #786198 - Flags: review?(fayearthur) → review?(vporof)
Comment on attachment 786198 [details] [diff] [review]
859059.patch

Review of attachment 786198 [details] [diff] [review]:
-----------------------------------------------------------------

I'm not a curl ninja, but here are a few things you should also add:
--http1.0 if necessary
--cookie for request cookies
--data-urlencode for url encoded post data params
--form for multipart/form data
--data-binary to more nicely handle binary data in post requests
--no-keepalive if the headers specify this

It's discussable if adding the following is a good idea or not:
--user-agent which in this case would be Mozilla...
--referer if there is any (i think you can get this from headers)
--location if the response code was 3xx
--keepalive-time and/or --connect-timeout if the headers specify this

We should talk about this in the channel, or ask some smarter people what they think :)

You'll also need to add a test. Look at the other context menu tests for inspiration (e.g. for "Copy URL").

::: browser/devtools/netmonitor/netmonitor-view.js
@@ +379,5 @@
> +    let selected = this.selectedItem.attachment;
> +    let ignoredHeaders = {};
> +    command.push(selected.url);
> +
> +    if(selected.method !== "GET") {

Nit: add a space before (

@@ +386,5 @@
> +    }
> +
> +    if (selected.requestPostData) {
> +      command.push("--data");
> +      command.push('"' + selected.requestPostData.postData.text + '"');

"text" is a long string. You'll need to fetch it with gNetwork.getString

@@ +387,5 @@
> +
> +    if (selected.requestPostData) {
> +      command.push("--data");
> +      command.push('"' + selected.requestPostData.postData.text + '"');
> +      ignoredHeaders["Content-Length"] = true;

It doesn't look like you're using this anywhere.
Attachment #786198 - Flags: review?(vporof) → feedback+
"I couldn't figure out when to add the --data-binary" 

--data-binary is used for multipart/form-data requests

The way Dev Tools and Firebug handle this is to add the payload / post data to the argument.
For file input parts, the part data should contain only the Content-Disposition and Content-Type lines. No raw binary data.

Example output from Firebug's implementation (formatted for reading):

--data-binary $'
-----------------------------610788659553159153489176608\r\n
Content-Disposition: form-data; name="param1"\r\n\r\n
123\r\n
-----------------------------610788659553159153489176608\r\n
Content-Disposition: form-data; name="file2"; filename="binary-87114-43102.png"\r\n
Content-Type: image/png\r\n\r\n
-----------------------------610788659553159153489176608\r\n
Content-Disposition: form-data; name="html"\r\n\r\n
Text echoed back to request\r\n
-----------------------------610788659553159153489176608--\r\n'

The end user has to make sure the filenames point to valid local file paths.


Multipart/form-data case: 
https://github.com/firebug/firebug/blob/685c3829dae8b796757ba253cb9f86336e55136f/extension/content/firebug/net/netUtils.js#L607-L614

Parse the post text: 
https://github.com/firebug/firebug/blob/685c3829dae8b796757ba253cb9f86336e55136f/extension/content/firebug/net/netUtils.js#L652-L693

Bug:
http://code.google.com/p/fbug/issues/detail?id=6817

Hope this helps,

Thomas
Hi Avinash

are you still working on this?
Hi! I had totally forgotten about this bug. I can work on this in a few days, unless someone else wants to take it and work on it. Sorry for forgetting. My bad.
OK, I would like to work on this task,
starting with implementing Victor's first list from comment 11 in this thread.
Attached patch bug859059-initial.patch (obsolete) — Splinter Review
Hi

Here is a POC for Copy as cURL.
Please treat it as a POC; the code needs improvements.

The following cURL command string is generated:

curl
  url                                   - request url
  [-X 'HEAD']                           - request method. Added when the request method is not GET/POST
  [--data 'param1=123&param2=cheese..'] - post request parameters
  [--data-binary $'--...-1231...']      - multipart request parameters
  [--http/VERSION]                      - HTTP VERSION. Added when the version is not the default version
  -H 'Connection: keep-alive'...        - request header. One -H argument per request header

Simple page for some manual testing: http://jsfiddle.net/HKnw3/

For now everything is added to the netmonitor-view.js/Curl object. Hopefully this will ease the review.
Most of the functions except for generateCommand() are helper functions which deal with the postData payload text.
I guess some of this can be moved to the top level or to a helper.

My wish for the next step is to re-organize the code, so I would need some direction on where the code should live.
What do you think?
Flags: needinfo?(vporof)
Comment on attachment 8346247 [details] [diff] [review]
bug859059-initial.patch

Review of attachment 8346247 [details] [diff] [review]:
-----------------------------------------------------------------

Sorry about the late reply :) This looks gorgeous, it's very clean and organized. Thank you Thomas, a great start! Make sure the licensing concerns have been dealt with if you're reusing code from other places (should be compatible with MPL 2.0).

Let me know if you need help with anything else, like writing tests.

::: browser/devtools/netmonitor/netmonitor-view.js
@@ +2050,5 @@
>  };
>  
> +/* Curl START */
> +
> +let Curl = {

You could move this into netmonitor-controller.js, or even better: having a separate file in browser/devtools/shared/Curl.jsm

You'd then best lazily import this in the controller, using 
XPCOMUtils.defineLazyModuleGetter(this, "Curl", "resource://gre/modules/Curl.jsm");

@@ +2114,5 @@
> +      }
> +      command.push("-H");
> +      command.push(escapeString(header.name + ": " + header.value));
> +    }
> +    

Nit: be careful about not leaving trailing whitespace.

@@ +2138,5 @@
> +    let requestPostData = aData.requestPostData;
> +    if (!(requestPostData && typeof requestPostData.postData.text == "string")) {
> +      return false;
> +    }
> +    if (requestPostData.postData.text.toLowerCase().startsWith("content-type: application/x-www-form-urlencoded")) {

Nit: I'm not an advocate of the 80 chars per line rule, but we generally try to keep things reasonably succinct. Split this and similar lines a bit by declaring a few variables.
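For example, that check could be split along these lines (just a sketch; the variable names are illustrative):

let postText = requestPostData.postData.text.toLowerCase();
let urlEncodedType = "content-type: application/x-www-form-urlencoded";
if (postText.startsWith(urlEncodedType)) {
  return true;
}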
Attachment #8346247 - Flags: feedback+
Flags: needinfo?(vporof)
Thank you for the review Victor

Regarding licenses. I am not a license expert, but Firebug's BSD license[1] should be compatible with MPL 2.0

I am not sure about the license for the two functions from the Chromium/Blink project 'escapeStringPosix' and 'escapeStringWin' and need some help on that.

The copyright header in the file containing the JS functions does not say anything about a license, and I cannot find any specifics when traversing up the file tree[2].
As far as I know the Blink project is dual licensed with BSD v2.0 and GNU LGPL v2.1[3], which immediately _sounds_ compatible with MPL 2.0.
If I remember correctly much of the code here is derived from Chromium.

As I interpret Blink's NetworkPanel.js header we are 'free to go' as long as the copyright is retained. 

But of course I need some solid confirmation on this. Can you check this further?

[1] https://github.com/firebug/firebug/blob/master/extension/license.txt
[2] https://chromium.googlesource.com/chromium/blink.git/+/master/Source/devtools/front_end/NetworkPanel.js
[3] http://en.wikipedia.org/wiki/Blink_%28layout_engine%29

Regarding tests, I could try looking at the browser/devtools/netmonitor/test/browser_net_copy_url.js and use that as a template for this test.
I could also try to set up some sensible unit tests for the Curl object's functions.
Let me know if you have different or additional thoughts on this.
I'm not a license expert either, so asking Gerv to double check this. We're probably fine, but it's good to double check. Gerv, please see comment 18.
Flags: needinfo?(gerv)
(In reply to Thomas Andersen from comment #18)
> Thank you for the review Victor
> 
> Regarding licenses. I am not a license expert, but Firebug's BSD license[1]
> should be compatible with MPL 2.0

If the code is in a separate file, no further action is necessary. This is the simplest route. If you are putting Firebug code into the same file as MPLed code, we either need to add extra headers or (better, if simple) ask the original author if relicensing is OK.

So if you tell me more about how you are combining the code, I can advise.

Gerv
Flags: needinfo?(gerv)
Thanks for the reply Gervase,

Right now I have one file which implements one object.
That object contains helper code borrowed from the Blink project and Firebug project.
Firefox/DevTools code has a 'direct link' to that file/object. 

Right now it looks like this:

browser/devtools/shared/Curl.jsm - implementation containing the code in question
browser/devtools/netmonitor/netmonitor-controller.js - points to the Curl object
browser/devtools/netmonitor/netmonitor-view.js - calls a method from Curl

Let me know if it's unclear
I have the second revision of the code addressing code issues discussed here and can create an attachment. 
Give me a "go" and I'll create the patch.

Thomas
> If the code is in a separate file, no further action is necessary.
Sorry, I missed that in my previous comment

I have looked at and recognize Firebug code in the existing network-helper.js[1] file. It seems to be in the same boat as this file so maybe I could use that header as a template.

Eg.

* Blink's comment header: https://chromium.googlesource.com/chromium/blink.git/+/master/Source/devtools/front_end/NetworkPanel.js
* Plus attributions to the Firebug working group

[1]https://hg.mozilla.org/mozilla-central/file/fffa3d03527e/toolkit/devtools/webconsole/network-helper.js

I guess it is ok to continue code review. If the license issue is not resolved successfully we'll just have to try finding another way.
Example header for /browser/devtools/shared/Curl.jsm 

/* -*- Mode: javascript; tab-width: 2; indent-tabs-mode: nil; c-basic-offset: 2 -*- */
/* vim: set ft=javascript ts=2 et sw=2 tw=80: */
/* This Source Code Form is subject to the terms of the Mozilla Public
 * License, v. 2.0. If a copy of the MPL was not distributed with this
 * file, You can obtain one at http://mozilla.org/MPL/2.0/. */

/*
 * Copyright (C) 2007, 2008 Apple Inc.  All rights reserved.
 * Copyright (C) 2008, 2009 Anthony Ricaud <rik@webkit.org>
 * Copyright (C) 2011 Google Inc. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *
 * 1.  Redistributions of source code must retain the above copyright
 *     notice, this list of conditions and the following disclaimer.
 * 2.  Redistributions in binary form must reproduce the above copyright
 *     notice, this list of conditions and the following disclaimer in the
 *     documentation and/or other materials provided with the distribution.
 * 3.  Neither the name of Apple Computer, Inc. ("Apple") nor the names of
 *     its contributors may be used to endorse or promote products derived
 *     from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY APPLE AND ITS CONTRIBUTORS "AS IS" AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
 * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
 * DISCLAIMED. IN NO EVENT SHALL APPLE OR ITS CONTRIBUTORS BE LIABLE FOR ANY
 * DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
 * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
 * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
 * ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
 * THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

/*
 * Software License Agreement (BSD License)
 * 
 * Copyright (c) 2009, Mozilla Foundation
 * All rights reserved.
 * 
 * Redistribution and use of this software in source and binary forms, with or without modification,
 * are permitted provided that the following conditions are met:
 * 
 * * Redistributions of source code must retain the above
 *   copyright notice, this list of conditions and the
 *   following disclaimer.
 * 
 * * Redistributions in binary form must reproduce the above
 *   copyright notice, this list of conditions and the
 *   following disclaimer in the documentation and/or other
 *   materials provided with the distribution.
 * 
 * * Neither the name of Mozilla Foundation nor the names of its
 *   contributors may be used to endorse or promote products
 *   derived from this software without specific prior
 *   written permission of Mozilla Foundation.
 * 
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR
 * IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
 * FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
 * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
 * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
 * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER
 * IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT
 * OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/

/*
 * Creator:
 *  Thomas Andersen <thomas@mr-andersen.no>
 * Contributors
 *  Google Inc.
 *  Firebug Working Group
 */

.. Code

Part 1: Standard MPL2.0
Part 2: Google's license
Part 3: Firebug's license
Part 4: Credits

As you can see in Firebug's license it is copyrighted by Mozilla Foundation. I am not sure if this part is needed and maybe just the credits will do.
I can contact the Firebug project lead for more information.

Again, let me know and I can add a patch of the current code for a better overview. 

Thomas
Flags: needinfo?(vporof)
Flags: needinfo?(gerv)
I'll let Gerv decide what's best here.
Flags: needinfo?(vporof)
Does this file contain any _MPLed_ code taken from other sources?

If not, I suggest using the following license header, which is permitted because (wonderful!) the two licenses have the same text, and I'm waiving clause 3 for MoFo (which has no real effect - you can't use anyone's name in that way without permission anyway) so we don't have to include our name in there:

/*
 * Copyright (C) 2007, 2008 Apple Inc.  All rights reserved.
 * Copyright (C) 2008, 2009 Anthony Ricaud <rik@webkit.org>
 * Copyright (C) 2011 Google Inc. All rights reserved.
 * Copyright (C) 2009 Mozilla Foundation. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *
 * 1.  Redistributions of source code must retain the above copyright
 *     notice, this list of conditions and the following disclaimer.
 * 2.  Redistributions in binary form must reproduce the above copyright
 *     notice, this list of conditions and the following disclaimer in the
 *     documentation and/or other materials provided with the distribution.
 * 3.  Neither the name of Apple Computer, Inc. ("Apple") nor the names of
 *     its contributors may be used to endorse or promote products derived
 *     from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
 * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
 * DISCLAIMED. IN NO EVENT SHALL APPLE OR ITS CONTRIBUTORS BE LIABLE FOR ANY
 * DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
 * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
 * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
 * ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
 * THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

Mozilla strongly discourages (for various reasons) per-file credit information and also (for various other reasons) per-file copyright ownership information. We have to keep some of the latter here because of what the licence says, but let's not add more.

Gerv
Flags: needinfo?(gerv)
> Does this file contain any _MPLed_ code taken from other sources?

No
Flags: needinfo?(gerv)
OK. So please use the header as I've given it in comment 25, and don't copy any MPLed code into the file.

Gerv
Flags: needinfo?(gerv)
Attached patch bug859059-rev2.patch (obsolete) — Splinter Review
Hi again, this is revision 2 based on feedback from comment 17 and the license discussion.

* Code moved to browser/devtools/shared/Curl.jsm file
* Applied the license header mentioned in comment 25
* The file is lazy imported from netmonitor-controller.js

I will start looking at creating some browser tests next. 
A plain test page used for development is set up here http://mr-andersen.no/curl/

Question:
Should we add the -i (include) flag for HEAD requests?

Plain cURL HEAD requests are not particularly useful from a client perspective, as the server usually just serves the document header.

curl 'http://www.mozilla.org' -X HEAD -i

Awaiting further feedback

thomas
Attachment #8359471 - Flags: review?(vporof)
Attached patch bug859059-rev3.patch (obsolete) — Splinter Review
removed some unwanted white space
Attachment #8346247 - Attachment is obsolete: true
Attachment #8359471 - Attachment is obsolete: true
Attachment #8359471 - Flags: review?(vporof)
Attachment #8359478 - Flags: review?(vporof)
Nice work, Thomas. Thanks for this.

Just for the record, we imported and modified some Firebug code previously without any issue. It passed legal review and Firebug's license was deemed MPL compatible at the time.

cheers!
Attached patch bug859059-rev4.patch (obsolete) — Splinter Review
removed log statements
Attachment #8359478 - Attachment is obsolete: true
Attachment #8359478 - Flags: review?(vporof)
Attachment #8362459 - Flags: review?(vporof)
Comment on attachment 8362459 [details] [diff] [review]
bug859059-rev4.patch

Review of attachment 8362459 [details] [diff] [review]:
-----------------------------------------------------------------

I very much like this a lot! Great progress and thanks for all the work. Sorry for the licensing issues taking so long, but it's better to be careful than sorry; hope this didn't prove to be too boring :) Apart from a few comments below (mostly about "long strings"), everything looks good to me. Let me know how I can help you with tests. Positive feedback from me.

Mihai, please take a look through this file and let us know if there's any web console shared code that we can use instead of reimplementing it here. Also, kindly double-check the utility methods in the Curl object to see if they do what they're supposed to. I looked through them and they all seem to have the right logic, but I think it'd be a good idea if you double checked, if you have the time.

::: browser/devtools/shared/Curl.jsm
@@ +45,5 @@
> +  /**
> +   * Generate a curl command based on the provided aData
> +   *
> +   * @param object aData
> +   *        The data source (this should be the attachment of a request item).

I suggest you document what the data source should look like here. Add a little bit of detail about what the object looks like, what properties it should have, what's supported, etc.

@@ +67,5 @@
> +
> +    // Create post data
> +    let postDataText;
> +    if (isUrlEncodedRequest || aData.method == "PUT") {
> +      postDataText = aData.requestPostData.postData.text;

The problem is that |text| is a long string, which is not initially fully sent via the remote protocol to the client to save bandwidth. To receive the full contents, you'll need to fetch it. I suggest that the generateCommand method should not take a vanilla "attachment", but a sanitized object (also with less nesting!) which contains all the data needed to generate the curl command. Consumers of this library would have to build this object themselves.

See getString in netmonitor-controller.js (and its use in -view.js) to see how to fetch the full text content of long strings.

If you're interested, here's some documentation about the web console remoting api and what packets represent: https://developer.mozilla.org/en-US/docs/Tools/Web_Console/remoting

Also, if you're interested for more details, here's the long string actor implementation: http://dxr.mozilla.org/mozilla-central/source/toolkit/devtools/server/actors/string.js
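Roughly, the consumer side would do something like this before handing the data over (a sketch; |sanitizedData| is an illustrative name, and gNetwork.getString is assumed to return a promise resolving to the full text):

let postData = selected.requestPostData.postData.text;
gNetwork.getString(postData).then(aFullText => {
  // aFullText is the complete payload, even when the protocol initially
  // sent only a truncated long string grip.
  sanitizedData.postDataText = aFullText;
});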

@@ +73,5 @@
> +      data.push(escapeString(this.writePostDataTextParams(postDataText)));
> +      ignoredHeaders["Content-Length"] = true;
> +      inferredMethod = "POST";
> +    } else if (isMultipartRequest) {
> +      postDataText = aData.requestPostData.postData.text.initial;

Same here. Long strings!

@@ +85,5 @@
> +      command.push("-X");
> +      command.push(aData.method);
> +    }
> +
> +    // Add http version

Nit: please add a period (.) after each sentence in comments.

@@ +103,5 @@
> +      if (header.name in ignoredHeaders) {
> +        continue;
> +      }
> +      command.push("-H");
> +      command.push(escapeString(header.name + ": " + header.value));

header.value is, you guessed it, a long string.

@@ +108,5 @@
> +    }
> +
> +    // Add post data
> +    command = command.concat(data);
> +    

Nit: make sure there's no trailing or redundant whitespace. There's some whitespace at the beginning of the line here, and in a few other places in the file.

@@ +110,5 @@
> +    // Add post data
> +    command = command.concat(data);
> +    
> +    // Add --compressed
> +    command.push("--compressed");

Do we need to always add --compressed? Why? Would an option be suitable here?

@@ +117,5 @@
> +  },
> +  
> +  /**
> +   * Check if the attachment is URL encoded.
> +   * Move to helper?

I would suggest you make this and any other utility functions that are not currently used by any consumers private. That is, just add a _ prefix, and responsible programmers won't call them :) Alternatively, you could just move all of them outside of the Curl object and *not* export them, then they won't ever be accessible.
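A sketch of the second option, with a module-scoped helper that never gets exported (names are illustrative):

this.EXPORTED_SYMBOLS = ["Curl"];

this.Curl = {
  generateCommand: function(aData) {
    let contentType = findHeader(aData.requestHeaders.headers, "content-type");
    // ...build and return the command string...
  }
};

// Not listed in EXPORTED_SYMBOLS, so consumers of the module can't reach it.
function findHeader(aHeaders, aName) {
  // ...
}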

@@ +126,5 @@
> +   *         True if the attachment is URL encoded, false otherwise.
> +   */
> +  isUrlEncodedRequest: function(aData) {
> +    let requestPostData = aData.requestPostData;
> +    if (!(requestPostData && typeof requestPostData.postData.text == "string")) {

Long strings :)

@@ +129,5 @@
> +    let requestPostData = aData.requestPostData;
> +    if (!(requestPostData && typeof requestPostData.postData.text == "string")) {
> +      return false;
> +    }
> +    let postText = requestPostData.postData.text.toLowerCase();

Strings that are long :)

@@ +130,5 @@
> +    if (!(requestPostData && typeof requestPostData.postData.text == "string")) {
> +      return false;
> +    }
> +    let postText = requestPostData.postData.text.toLowerCase();
> +    if (postText.startsWith("content-type: application/x-www-form-urlencoded")) {

Does it have to start with that? Wouldn't it be better to simply check for .contains "x-www-form-urlencoded", or maybe even regex? That's what we do in netmonitor-view.js for _setRequestPostParams().

@@ +135,5 @@
> +      return true;
> +    }
> +    let contentType = this.findHeader(aData.requestHeaders.headers, "content-type");
> +    
> +    return (contentType && contentType.startsWith("application/x-www-form-urlencoded"));

Again, .contains vs. .startsWith vs. regex. I'm interested in what you think.
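For comparison, the .contains variant would be roughly (sketch):

let contentType = findHeader(aData.requestHeaders.headers, "content-type");
return contentType && contentType.contains("application/x-www-form-urlencoded");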

@@ +149,5 @@
> +   *         True if the attachment is url encoded, false otherwise.
> +   */
> +  isMultipartRequest: function(aData) {
> +    let requestPostData = aData.requestPostData;
> +    if (!(requestPostData && requestPostData.postData.text.initial)) {

Long strings, long strings everywhere!

@@ +153,5 @@
> +    if (!(requestPostData && requestPostData.postData.text.initial)) {
> +      return false;
> +    }
> +    let postText = requestPostData.postData.text.initial.toLowerCase();
> +    if (postText.startsWith("content-type: multipart/form-data")) {

.contains as opposed to .startsWith vs. regex

@@ +193,5 @@
> +    for (let i = 0; i < aHeaders.length; i++) {
> +      let header = aHeaders[i];
> +      let headerName = header.name.toLowerCase();
> +      if (headerName == name) {
> +        return header.value;

header.value is a long string. I think you get the idea so I'll stop repeating myself.

@@ +222,5 @@
> +        // filename= tells us that the part is a file input type.
> +        if (/filename=/im.test(contentDispositionLine)) {
> +          // For file input parts
> +          // Remove binary data. Only the Content-Disposition and Content-Type lines
> +          // should remain.

This is a funky formatted comment.

@@ +238,5 @@
> +
> +  /**
> +   * Finds headers in post data text.
> +   * Credit: firebug/lib/http
> +   * Move to helper?

I think there might be some utilities used by the web console actors that we can reuse here. Deferring to msucan to take a look at the file. Reusing as much code as possible is the best idea here.

@@ +356,5 @@
> +   * Check if operating system is Windows
> +   *
> +   * @return boolean
> +   *         True if operating system is Windows, else false
> +   *         https://developer.mozilla.org/en-US/docs/Code_snippets/Miscellaneous#Operating_system_detection               

I don't think you have to add a link to Code_snippets here. Using Cc, Ci and .getService is pretty standard across mozilla code.
Attachment #8362459 - Flags: review?(vporof) → feedback?(mihai.sucan)
(In reply to Thomas Andersen from comment #28)
> Created attachment 8359471 [details] [diff] [review]
> 
> Question:
> Should we add the -i (include) flag for HEAD requests?
> 

It depends on what the other browsers do in this case, and if there's a de-facto behavior that might be a good idea to respect ourselves. I would vote for having -i.
(In reply to Victor Porof [:vp] from comment #33)
> (In reply to Thomas Andersen from comment #28)
> > Created attachment 8359471 [details] [diff] [review]
> > 
> > Question:
> > Should we add the -i (include) flag for HEAD requests?
> > 
> 
> It depends on what the other browsers do in this case, and if there's a
> de-facto behavior that might be a good idea to respect ourselves. I would
> vote for having -i.

I definitely want -i. I was surprised that I didn't get the response headers when I tried the curl command copied from the netmonitor.

Thomas: this is really good stuff!
Comment on attachment 8362459 [details] [diff] [review]
bug859059-rev4.patch

Review of attachment 8362459 [details] [diff] [review]:
-----------------------------------------------------------------

Tested Chrome and they do not include -i which seems rather sad, IMO. I still vote for -i, because we do not have to do things identical to Chrome.

Victor rightly points out this patch needs to be updated to correctly handle long strings for all postData usage and header.values. Check the web console remoting docs to see which packets can be long strings.

Regarding the use of existing code from the network logging actors there are several points to make:

- we have bug 812616 which is about improving the handling of multipart POST data and form data. We do not do *anything* at all for these cases in the NetworkMonitor implementation (see toolkit/devtools/webconsole/utils.js).

- we have a crude parser for form data in network-panel.js (hg locate it). Please see _fromDataRegExp, _isRequestBodyFormData() and _displayRequestForm().

- Victor wrote code for handling form data as well, in netmonitor-view.js.

Victor: do you have something for multipart post data or is that displayed raw?

- we should fix bug 812616, by parsing/preparing the query string params, the POST form data and the multipart data, in JSON packets that we can send from the server to the client, using the network logging actors. In this way we can avoid code duplication and bugs.

You could try to reuse code from the network panel or from the network monitor, as you see fit, but you might need to make small changes.

What I suggest: see if you can easily reuse anything from existing code; if yes, do so. Otherwise, add a comment about this in bug 812616 so we do not forget about these problems. You don't have to bother too much about this situation.

Overall, this patch is good. f+ and thank you for the work!

::: browser/devtools/netmonitor/netmonitor.xul
@@ +32,5 @@
>                  accesskey="&netmonitorUI.context.copyUrl.accesskey;"
>                  oncommand="NetMonitorView.RequestsMenu.copyUrl();"/>
> +      <menuitem id="request-menu-context-copy-as-curl"
> +                label="&netmonitorUI.context.copyAsCurl;"
> +                accesskey=""

If you do not specify an accesskey please leave out the attribute. I would prefer that you assign an access key to the menu item.

::: browser/devtools/shared/Curl.jsm
@@ +51,5 @@
> +   *         A cURL command
> +   */
> +  generateCommand: function(aData) {
> +    let command = ["curl"];
> +    let ignoredHeaders = {};

nit: You could use a Set here.
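Something like this (sketch):

let ignoredHeaders = new Set();
ignoredHeaders.add("Content-Length");
// Later, while iterating the request headers:
if (ignoredHeaders.has(header.name)) {
  continue;
}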

@@ +94,5 @@
> +    // Add request headers
> +    let headers = aData.requestHeaders.headers;
> +    if (postDataText) {
> +      let postRequestHeaders = this.getHeadersFromPostText(postDataText);
> +      headers = headers.concat(postRequestHeaders);

nit: to avoid array copy, you could use headers.push.apply(headers, postRequestHeaders);

@@ +209,5 @@
> +   *        Post data text.
> +   * @return string
> +   *         Post text without binary data.
> +   */
> +  removeBinaryDataFromMultipartPostText: function(aText) {

It might be a good idea to go a bit through rfc2616 to see if this method is missing anything important.

@@ +211,5 @@
> +   *         Post text without binary data.
> +   */
> +  removeBinaryDataFromMultipartPostText: function(aText) {
> +    let boundaryRe = /^--.+/gm;
> +    let boundaryString = boundaryRe.exec(aText)[0];

The boundary string should be in content-type, as a parameter ;boundary=foobar. Maybe I'm mistaken, please double check. If so, it should be safer to get the boundary from the header value.

http://www.w3.org/Protocols/rfc2616/rfc2616-sec19.html#sec19.2
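If the multipart Content-Type value is available (from the header or from the first line of the post text), extracting the boundary could look roughly like this (a sketch, not verified against every server):

// E.g. "multipart/form-data; boundary=---------------------------6107886595"
let match = contentType && /;\s*boundary=("?)([^";]+)\1/i.exec(contentType);
let boundaryString = match ? match[2] : null;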

@@ +223,5 @@
> +        if (/filename=/im.test(contentDispositionLine)) {
> +          // For file input parts
> +          // Remove binary data. Only the Content-Disposition and Content-Type lines
> +          // should remain.
> +          let headersRe = /[\r\n]+Content-Disposition.+$[\r\n]+Content-Type.+$[\r\n]+/im;

This is assuming that Content-Disposition precedes Content-Type. The order of headers does not matter, afaik.

@@ +258,5 @@
> +    }
> +    let text = aText.substring(0, index);
> +    let lines = this.splitLines(text);
> +    for (let i = 0; i < lines.length; i++) {
> +      let header = lines[i].split(":");

If I am not mistaken, this is easy to break. This code expects headers are of the following form:

header1: value1
header2: value2
...

which is true in the common cases, but not all cases. You can get:

header1: value1
header1: continued value1
 continued value1
...
header2: value2
header3: some:foo

Notice what I did there:

- header names can be repeated multiple times. You can get the same name for some header names, and the values get concatenated together, comma separated. The code we have here doesn't break in this case because the array of headers it builds will include multiple items with the same header name, but different values, which is fine.
- header values can span multiple lines, as long as each line starts with a space or a tab.
- header values have no escaping requirements, so you can have ":" in the value. The above split(":") will break for Referer, Location or any header with any kind of IRI/URN, etc.

Recommended reading: http://www.w3.org/Protocols/rfc2616/rfc2616.html

@@ +284,5 @@
> +  splitLines: function(aText) {
> +    if (!aText) {
> +      return [];
> +    }
> +    let reSplitLines = /.*(:?\r\n|\n|\r)?/mg;

This regex matches more than needed. Why not simply /\r\n|\r|\n/? The m and g modifiers should not be needed.

@@ +286,5 @@
> +      return [];
> +    }
> +    let reSplitLines = /.*(:?\r\n|\n|\r)?/mg;
> +    let lines;
> +    if (aText.match) {

Is aText expected to not be a string?

@@ +358,5 @@
> +   * @return boolean
> +   *         True if operating system is Windows, else false
> +   *         https://developer.mozilla.org/en-US/docs/Code_snippets/Miscellaneous#Operating_system_detection               
> +   */
> +  isWin: function() {

You can replace this method with Services.appinfo.OS == "WINNT".

(make sure you import Services.jsm, but try to avoid OS-specific behavior)
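I.e. something along these lines (sketch, reusing the escapeStringWin/escapeStringPosix helpers already in the patch):

Components.utils.import("resource://gre/modules/Services.jsm");

function escapeString(aStr) {
  // Pick the escaping style for the shell the command will be pasted into.
  return Services.appinfo.OS == "WINNT"
    ? escapeStringWin(aStr)
    : escapeStringPosix(aStr);
}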
Attachment #8362459 - Flags: feedback?(mihai.sucan) → feedback+
Hi
Thank you for the reviews. I will look closer at the latest comments next.

> Sorry for the licensing issues taking so long, but it's better to be careful than sorry; hope this didn't prove to be too boring :)
I understand. Among other things, we are talking about the workplace of a lot of people, and the browser.

> Let me know how I can help you with tests
For browser tests, is there a simple sleep-like function?

E.g. I have based my first try on netmonitor/test/browser_net_copy_url.js, but it seems like the request headers for the selected item are not added when the waitForNetworkEvents promise is completed.
Right now I have worked around this using an extra setTimeout call, but I am not sure if something is wrong here.

Sample:

    ...
    waitForNetworkEvents(aMonitor, 1).then(() => {
      // requestItem is ready, but item.attachment.requestHeaders.headers is not added at this point.
      setTimeout(() => {
        let requestItem = RequestsMenu.getItemAtIndex(0);
        RequestsMenu.selectedItem = requestItem;

        replaceUserAgent(requestItem.attachment.requestHeaders.headers);
        ...
        let expectedValue = "curl '" + SIMPLE_URL + "' -H 'Host: example.com' -H 'User-Agent: Test User Agent' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' -H 'Accept-Language: en-US,en;q=0.5' -H 'Accept-Encoding: gzip, deflate' -H 'Referer: http://example.com/browser/browser/devtools/netmonitor/test/html_custom-get-page.html' -H 'Connection: keep-alive' --compressed";

        waitForClipboard(expectedValue, function setup() {
        ...

      }, 1000);


See also,
https://hg.mozilla.org/mozilla-central/file/cdc0ab2c0cba/browser/devtools/netmonitor/test/browser_net_copy_url.js

The idea is to start out with a browser test that reads the clipboard data for each request method (get, post, put etc.)
I guess I could try to add some reasonable xpcshell type tests for Curl.jsm too.
As Mihai said, other tools today do not add the include / -i argument to the command.
Adding the argument to the generated command makes more sense to me. +1

I understand it as a curl behavior. It makes sense somehow, as a HEAD request does not send the document body.

http://stackoverflow.com/a/19601923/749276
http://www.commandlinefu.com/commands/view/1685/send-an-http-head-request-wcurl
I think -I is the arg that should be used. It gives you back the prompt.


http://curl.haxx.se/docs/manpage.html
> Do we need to always add --compressed? Why? Would an option be suitable here?

Maybe :) I am a bit biased as a web developer where my perspective is to always inspect the document body text.
On the other hand, it does not give some users the expected response.
Chrome devtools always adds this argument. Firebug uses an option (default is false).

Maybe someone else has an opinion about this.

It is no problem for me to base it on an option.
Fetching a compressed resource results in a faster download, so in that sense it's a nice thing to have. Also, since the whole purpose of this feature is to get a string in the clipboard to paste in my terminal, I don't see much point in having prefs to customize the way this string is created. I can always just edit the command line before hitting ENTER. Conversely, if it turns out that requesting a compressed resource can't be handled gracefully by some servers and results in errors for an otherwise fine request, we should just omit it completely and let the user type it manually in the command line.

I love curl so I thought I'd add my 2 cents. Also, +1 on -I, I used it all the time in a previous life.
@@ +130,5 @@
> +    if (!(requestPostData && typeof requestPostData.postData.text == "string")) {
> +      return false;
> +    }
> +    let postText = requestPostData.postData.text.toLowerCase();
> +    if (postText.startsWith("content-type: application/x-www-form-urlencoded")) {

>Does it have to start with that? Wouldn't it be better to simply check for .contains "x-www-form-urlencoded", or maybe even regex? That's what we do in netmonitor-view.js for _setRequestPostParams().

As long as the strings consistently start with that prefix I think it is more explicit. I do not want to be a fanatic here ;) so if the code already uses contains for good reasons I will use that.
As the strings are fixed I think String.prototype.contains() would be sufficient. Informal performance test, http://jsperf.com/substring-test/22
@@ +284,5 @@
> +  splitLines: function(aText) {
> +    if (!aText) {
> +      return [];
> +    }
> +    let reSplitLines = /.*(:?\r\n|\n|\r)?/mg;

> This regex matches more than needed. Why not simply /\r\n|\r|\n/? The m and g modifiers should not be needed.

Not sure if I misunderstand something or maybe the function name is a bit vague?
The text argument is/can be multiline text so both flags are needed I think.
/\r\n|\r|\n/ would just match the newline/carriage returns which is not the intention. 
It should return all lines in aText as an array.

Fiddle: http://jsfiddle.net/EHaa2/ Run and check the console


@@ +286,5 @@
> +      return [];
> +    }
> +    let reSplitLines = /.*(:?\r\n|\n|\r)?/mg;
> +    let lines;
> +    if (aText.match) {

> Is aText expected to not be a string?

My bad, should have spotted this. It is from a general string helper in Firebug
> I don't see much point in having prefs to customize the way this string is created. I can always just edit the command line before hitting ENTER

Agree!
> This regex matches more than needed. Why not simply /\r\n|\r|\n/? The m and g modifiers should not be needed.

Ah, thinking about it, we could use that regex and just split the text. Much simpler ;)

aText.split(/(\r\n|\r|\n)/);
*without the capturing parentheses
(In reply to Thomas Andersen from comment #44)
> > This regex matches more than needed. Why not simply /\r\n|\r|\n/? The m and g modifiers should not be needed.
> 
> Ah, thinking about it, we could use that regex and just split the
> text. Much simpler ;)
> 
> aText.split(/(\r\n|\r|\n)/);

Exactly what I wanted to reply. Much simpler to use that regex+split().
(In reply to Thomas Andersen from comment #41)
> @@ +130,5 @@
> > +    if (!(requestPostData && typeof requestPostData.postData.text == "string")) {
> > +      return false;
> > +    }
> > +    let postText = requestPostData.postData.text.toLowerCase();
> > +    if (postText.startsWith("content-type: application/x-www-form-urlencoded")) {
> 
> >Does it have to start with that? Wouldn't it be better to simply check for .contains "x-www-form-urlencoded", or maybe even regex? That's what we do in netmonitor-view.js for _setRequestPostParams().
> 
> As long as strings consequently starts with the string I think it is more
> explicit. I do not want to be a fanatic here ;) so if the code already uses
> contains with reasons I will use that. 
> As the strings are fixed I think String.prototype.contains() would be
> sufficient. Informal performance test, http://jsperf.com/substring-test/22

IIANM the request to use contains() is about correctness. Can we always expect that the postText starts with content-type...? The network panel actually uses a regex that allows any amount of whitespace between the header name and value, but it fails to ignore the case ...
(In reply to Mihai Sucan [:msucan] from comment #35)
> Comment on attachment 8362459 [details] [diff] [review]
> 
> Victor: do you have something for multipart post data or is that displayed
> raw?

The raw data is displayed in an editor at the moment. Pretty awful, but there's a bug on file to make that better. Nothing reusable here as far as I know.

(In reply to Thomas Andersen from comment #36)
> > Let me know how I can help you with tests
> For browser tests, is there a simple sleep-like function?
> 

Sleep functions and similar should be ABSOLUTELY avoided in tests, because they introduce timing-dependent behavior which results in intermittent failures.

Rule 1: There should never be a reason for using setTimeout, setInterval or the sorts in tests.
Rule 2: If there seems like there is a reason, refer to Rule 1.
Rule 3: If you managed to break out of the Rule 1/Rule 2 loop and ended at Rule 3, it means you have a quantum computer, which means you're a billionaire, so you probably don't really want to worry about intermittent timing-dependent test failures.

Two things you can do:

1. RequestsMenu.lazyUpdate = false;, so that everything is populated synchronously and you most likely don't need to worry about this stuff anymore in tests (see the sketch after this list).

2. Try listening to some events, they're defined in netmonitor-controller.js, at the top. For example, SIDEBAR_POPULATED or NETWORKDETAILSVIEW_POPULATED might be useful, but I don't think you need them if you set RequestsMenu.lazyUpdate = false.
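Rewriting the snippet from comment 36 along those lines would look roughly like this (a sketch only):

    RequestsMenu.lazyUpdate = false;
    ...
    waitForNetworkEvents(aMonitor, 1).then(() => {
      // With lazy updates disabled the attachment (including requestHeaders)
      // is populated synchronously, so the extra setTimeout goes away.
      let requestItem = RequestsMenu.getItemAtIndex(0);
      RequestsMenu.selectedItem = requestItem;

      replaceUserAgent(requestItem.attachment.requestHeaders.headers);
      ...
      waitForClipboard(expectedValue, function setup() {
      ...
    });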
Thanks Victor! I will probably jump out of the whole cycle ;) I was just quickly trying out some things and will get back to the test code after the implementation has been improved.
Attached patch bug859059-rev5.patch (obsolete) — Splinter Review
Hi

Sorry for the delay :) Here is a preview patch which addresses most of Mihai's and Victor's previous comments.
The reason for submitting this patch is to say hello and get some early feedback ;-)

@Victor
Could you please check the usage of getString() in netmonitor-view.js - copyAsCurl()
@Mihai
Could you please review Curl.jsm - getHeadersFromMultipartText(), especially the regex for getting a header line from the request payload headers section.

Regarding contains vs. regexp I ended up with toLowerCase().contains(str) for readability reasons.
Let me know if we should try to optimize this further.

I also introduced the -I (HEAD) flag and removed the --compressed argument from the generated string in this patch.
I guess these will be discussed further in the final stages of this task.

Note that fixes for the issues regarding removeBinaryDataFromMultipartPostText() are missing from this patch.

Quick overview:

* [done] add -I (HEAD) to the generated string for HEAD requests.
* [done] remove --compressed from generated string.
* [done] refactor utility functions to JS privates.
* [done] create a more sanitized source data object for the generateCommand function.
  * [done] describe the input data object in the doc comment.
* [done] use Set instead of Array for ignoredHeaders.
* [done] replace isWin() with Services.appinfo.OS
* [done] consistently use contains/regex instead of String startsWith().
* [done] robustify getHeadersFromMultipartText (support multiline values, see comment #35).
* [done] remove empty access key attribute in xul menu item.
* [done] consistently add a period (.) after sentences in comments.
* [done] remove all trailing spaces.
* [done] fix funky comment.

Also, there is a known issue now where the menu/menu item is available during the request (see multipart/form requests with large files).
When this happens the request payload string is not available, resulting in an incomplete cURL string.
Maybe we should file a new issue for this.

Quick test page:
http://mr-andersen.no/curl/

Awaiting further feedback,

t.
Attachment #8371428 - Flags: feedback?(vporof)
Attachment #8371428 - Flags: feedback?(mihai.sucan)
Comment on attachment 8371428 [details] [diff] [review]
bug859059-rev5.patch

Review of attachment 8371428 [details] [diff] [review]:
-----------------------------------------------------------------

Looks better. Thanks for the update Thomas!

The patch still has trailing whitespace.

More comments below, but this seems to be almost ready.

::: browser/devtools/shared/Curl.jsm
@@ +234,5 @@
> +    // The second line in a part should be the content disposition form-data.
> +    if (/^Content-Disposition: form-data/im.test(contentDispositionLine)) {
> +      // Remove binary data from file.
> +      if (/filename=/im.test(contentDispositionLine)) {
> +        let headersRe = /[\r\n]+Content-Disposition.+$[\r\n]+Content-Type.+$[\r\n]+/im;

This part still needs to be updated to address comment 35.

@@ +276,5 @@
> +  // Eg.
> +  //    X-Header: line1
> +  //     line2
> +  //     line3
> +  let headerLines = headersText.match(/^\b.+((\n\s.+)+|$)/gm);

Instead of this regex, you could do headersText.split(/\r\n|\n/), so you get all the lines. Go through each line and, like you do now, check for the colon. Store the header name and value, like now. However, also check if the line starts with \s (whitespace), if yes, don't do the colon indexOf, just add to the |headers| array the previous header name, with the value = the whole current line.trim().

let lastHeaderName = null;
for (let line of headerLines) {
  if (lastHeaderName && /^\s+/.test(line)) {
    headers.push({ name: lastHeaderName, value: line.trim() });
    continue;
  }

  let indexOfColon = line.indexOf(":");
  if (indexOfColon == -1) {
    continue;
  }
  let header = [line.slice(0, indexOfColon), line.slice(indexOfColon + 1)];
  lastHeaderName = header[0].trim();
  headers.push({ name: lastHeaderName, value: header[1].trim() });
}

@@ +278,5 @@
> +  //     line2
> +  //     line3
> +  let headerLines = headersText.match(/^\b.+((\n\s.+)+|$)/gm);
> +  for (let i = 0; i < headerLines.length; i++) {
> +    let line = headerLines[i]

nit: please end lines with ;
Attachment #8371428 - Flags: feedback?(mihai.sucan) → feedback+
Comment on attachment 8371428 [details] [diff] [review]
bug859059-rev5.patch

Review of attachment 8371428 [details] [diff] [review]:
-----------------------------------------------------------------

This looks awesome and it's progressing nicely! Thank you!

::: browser/devtools/netmonitor/netmonitor-view.js
@@ +491,5 @@
>  
>    /**
> +   * Copy a cURL command from the currently selected item.
> +   */
> +  copyAsCurl: function() {

You'll want to hide this menu item when there are no requests available. See _onContextShowing in this file: http://dxr.mozilla.org/mozilla-central/source/browser/devtools/netmonitor/netmonitor-view.js#1378

@@ +494,5 @@
> +   */
> +  copyAsCurl: function() {
> +    let selected = this.selectedItem.attachment;
> +
> +    // Create a sanitized object for the Curl generator.

Nit: "for the Curl command generator" sounds more explicit to me.

::: browser/devtools/netmonitor/netmonitor.xul
@@ +29,5 @@
>                  label="&netmonitorUI.context.copyUrl;"
>                  accesskey="&netmonitorUI.context.copyUrl.accesskey;"/>
> +      <menuitem id="request-menu-context-copy-as-curl"
> +                label="&netmonitorUI.context.copyAsCurl;"
> +                oncommand="NetMonitorView.RequestsMenu.copyAsCurl();"/>                

Nit: there's trailing whitespace here.

::: browser/devtools/shared/Curl.jsm
@@ +23,5 @@
> + *     its contributors may be used to endorse or promote products derived
> + *     from this software without specific prior written permission.
> + *
> + * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
> + * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 

Nit: trailing whitespace.

@@ +55,5 @@
> +   *          - headers:array, an array of request headers {name:x, value:x} tuples.
> +   *          - httpVersion:string, http protocol version rfc2616 formatted. Eg. "HTTP/1.1"
> +   *          - postDataText:string, optional - the request payload.
> +   * 
> +   * @return string

Nit: trailing whitespace.

@@ +62,5 @@
> +  generateCommand: function(aData) {
> +    let command = ["curl"];
> +    let ignoredHeaders = new Set();
> +
> +    // The cURL is command expected to run on the same platform that Firefox run

s/Firefox run/Firefox runs/

@@ +80,5 @@
> +      postDataText = aData.postDataText;
> +      data.push("--data");
> +      data.push(escapeString(writePostDataTextParams(postDataText)));
> +      ignoredHeaders.add("Content-Length");
> +      inferredMethod = "POST";

I might not be as familiar as you with curl, but please let me know why we're going to use POST when the request's method was PUT. Better yet, why not always use the vanilla method verb from the request?

@@ +143,5 @@
> +  }
> +
> +  postDataText = postDataText.toLowerCase();
> +
> +  if (postDataText.contains("content-type: application/x-www-form-urlencoded")) {

Do we need to .toLowerCase() this here?

@@ +149,5 @@
> +  }
> +
> +  let contentType = findHeader(aData.headers, "content-type");
> +
> +  return (contentType && 

Nit: trailing whitespace.

@@ +150,5 @@
> +
> +  let contentType = findHeader(aData.headers, "content-type");
> +
> +  return (contentType && 
> +    contentType.contains("application/x-www-form-urlencoded"));

Ditto, do we need to .toLowerCase() this here? Same question for everywhere .contains() is used.

@@ +208,5 @@
> +  for (let i = 0; i < aHeaders.length; i++) {
> +    let header = aHeaders[i];
> +    let headerName = header.name.toLowerCase();
> +    if (headerName == name) {
> +      return header.value;

All the header values are long strings. You should fetch them in netmonitor-view.js before sending them to Curl.jsm.

::: browser/locales/en-US/chrome/browser/devtools/netmonitor.dtd
@@ +199,5 @@
>  <!ENTITY netmonitorUI.context.copyUrl     "Copy URL">
>  
> +<!-- LOCALIZATION NOTE (netmonitorUI.context.copyAsCurl): This is the label displayed
> +  -  on the context menu that copies the selected request as a cURL command.
> +  -  The capitalization is part of the official name and should be used throughout all languages. 

Nit: trailing whitespace.
Attachment #8371428 - Flags: feedback?(vporof) → feedback+
Thanks for the reviews. I will look at this next.
Sorry for the recurring whitespace issue. I am trying to learn Sublime Text while doing these tasks :)
Just found these ST settings that may help,

"draw_white_space": "all",
"trim_trailing_white_space_on_save": true

@Victor and @Panos

> +      postDataText = aData.postDataText;
> +      data.push("--data");
> +      data.push(escapeString(writePostDataTextParams(postDataText)));
> +      ignoredHeaders.add("Content-Length");
> +      inferredMethod = "POST";

> I might not be as familiar as you with curl, but please let me know why we're going to use POST when the request's method was PUT. Better yet, why not always use the vanilla method verb from the request?

Yes, I guess we can always use the method verb from the data object.
The reason may be a personal preference and is of course up for discussion.

-X / --request is not needed for GET and POST requests, as cURL will infer the method from the given arguments.
E.g. when --data or --data-binary is added to the command, cURL performs a POST request if -X is not specified.
GET is the default. See the if-block right after the mentioned code. This should resolve the proper verb if needed.
Also, see the man page for -X/--request for more information.
Maybe it is a bit arcane and the code should be documented or we should just always add the -X arg.
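In other words, the intent of the current logic is roughly (a sketch, not necessarily the final behavior):

let inferredMethod = "GET";
if (data.length > 0) {
  // --data / --data-binary makes cURL issue a POST on its own.
  inferredMethod = "POST";
}
if (aData.method != inferredMethod) {
  command.push("-X");
  command.push(aData.method);
}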

Maybe Panos has some thoughts about this?

Chrome does it this way.

@Victor

> +  for (let i = 0; i < aHeaders.length; i++) {
> +    let header = aHeaders[i];
> +    let headerName = header.name.toLowerCase();
> +    if (headerName == name) {
> +      return header.value;

> All the header values are long strings. You should fetch them in netmonitor-view.js before sending them to Curl.jsm.

Maybe I misunderstand something, but I can not see that request header values are long strings. Any tips?

t.
Flags: needinfo?(vporof)
Flags: needinfo?(past)
I know that converting PUT requests to POST would return completely different results in RESTful web apps I've built in the past. I think this feature should try to reproduce the exact same request as much as possible. Converting PUT to POST is done in cases where clients can't issue PUT requests, which clearly isn't the case with cURL, or when the server does not distinguish between the two, which shouldn't be our concern.

About trailing whitespace, I've found that setting such prefs isn't a good idea, as it might modify unrelated parts of the file, which will annoy your reviewer ;-) What most people do instead, is add a mercurial commit hook to highlight trailing whitespace, like this:

[hooks]
commit.whitespace = hg export tip | (! grep -C 3 --color=always -n '^+.*[ ]$')
Flags: needinfo?(past)
(In reply to Thomas Andersen from comment #53)
> 
> > All the header values are long strings. You should fetch them in netmonitor-view.js before sending them to Curl.jsm.
> 
> Maybe I misunderstand something, but I can not see that request header
> values are long strings. Any tips?
> 

It's a gamble. I guess what I meant was that all header values *should be assumed* to be long strings. The web console actors send out plain strings when the header values are short, but long strings otherwise, so the safest approach is to always assume they're long and try fetching the full content beforehand. The getString method in the controller accounts for these situations.
Flags: needinfo?(vporof)
Again, sorry for the delay, got to pay the bills too :) 

Before I submit the new patch revision, can you please take a look at this, Victor?

Regarding long-stringing the request header values:
I got a bit unsure about the promises and I am not sure if it looks good.

// netmonitor-view.js
copyAsCurl: function() {
  let selected = this.selectedItem.attachment;

  // Create a sanitized object for the Curl command generator.
  let data = {
    url: selected.url,
    method: selected.method,
    headers: selected.requestHeaders.headers,
    httpVersion: selected.httpVersion,
    postDataText: null
  };

  function longStringHeaderValuesRecursive(i, requestHeaders) {
    if (i < requestHeaders.length) {
      let headerValuePromise = gNetwork.getString(requestHeaders[i].value).then(aString => {
        requestHeaders[i].value = aString;
        i++;
        longStringHeaderValuesRecursive(i, requestHeaders);
      });
    }
  }

  // Get the long string for post data text.
  let postDataPromise = null;
  if (selected.requestPostData) {
    let postDataText = selected.requestPostData.postData.text;
    postDataPromise = gNetwork.getString(postDataText).then(aString => {
      data.postDataText = aString;
    });
  } else {
    postDataPromise = promise.resolve();
  }

  postDataPromise.then(() => {
    longStringHeaderValuesRecursive(0, selected.requestHeaders.headers);
    clipboardHelper.copyString(Curl.generateCommand(data), document);
  });
},
Flags: needinfo?(vporof)
That's one way of doing it, but it's super fugly, and we have a better way. Promises and Tasks [0] [1] are the future :)

Task.spawn(function*() {
  let selected = this.selectedItem.attachment;

  // Create a sanitized object for the Curl command generator.
  let data = {
    url: selected.url,
    method: selected.method,
    headers: [],
    httpVersion: selected.httpVersion,
    postDataText: null
  };

  for (let { name, value } of selected.requestHeaders.headers) {
    let text = yield gNetwork.getString(value);
    data.headers.push({ name: name, value: text });
  }

  if (selected.requestPostData) {
    let postData = selected.requestPostData.postData.text;
    data.postDataText = yield gNetwork.getString(postData);
  }

  clipboardHelper.copyString(Curl.generateCommand(data), document);
});

[0] https://developer.mozilla.org/en-US/docs/Mozilla/JavaScript_code_modules/Task.jsm
[1] http://taskjs.org/
Flags: needinfo?(vporof)
Wow! That is much more beautiful. I was looking into for-of and promises, but the missing part here was Task. Thanks!
Attached patch bug859059-rev6.patch (obsolete) — Splinter Review
Hi

Here is a new patch based on previous comments.

Details

* Make sure the request header values are fetched from the server before passing it to the curl command generator.
* Skip lower casing the strings before searching etc. (the casing should be predictable).
* Fix mentioned issues in removeBinaryFromPostDataText (ref. comment #35).
* Use code instead of regex to parse the headers from post data text (comment #51).
* Hide the menu item during a request.
* Typos, missing semicolon, and trailing spaces.

@Mihai:
> The boundary string should be in content-type, as a parameter ;boundary=foobar. Maybe I'm mistaken, please double check. If possible, then it should be safer to get the boundary from the header value.

As far as I can see the content-type does not exist in the request headers array for multipart requests. It is only found in the post text.
We could of course refactor this a bit etc., but I left it as it is for now. Let me know and I can fix it.

@Victor: 
> I might not be as familiar as you with curl, but please let me know why we're going to use POST when the request's method was PUT. Better yet, why not always use the vanilla method verb from the request?

I tried to clean this up a bit to make the intention clearer.
As mentioned in comment #53, we could skip the logic altogether and always add the method verb.
Right now the generated command should be the same as in other tools.

t.
Attachment #8362459 - Attachment is obsolete: true
Attachment #8371428 - Attachment is obsolete: true
Attachment #8382127 - Flags: review?(vporof)
Attachment #8382127 - Flags: review?(mihai.sucan)
Comment on attachment 8382127 [details] [diff] [review]
bug859059-rev6.patch

Review of attachment 8382127 [details] [diff] [review]:
-----------------------------------------------------------------

This looks great! I only have a few very small comments, see below. We need a few tests for the new functionality. Let me know if there's any help you need with that, I'll be glad to assist! Can't wait to get this landed :D

::: browser/devtools/shared/Curl.jsm
@@ +64,5 @@
> +    let ignoredHeaders = new Set();
> +
> +    // The cURL command is expected to run on the same platform that Firefox runs
> +    // (it may be different from the inspected page platform).
> +    let escapeString = Services.appinfo.OS == "WINNT" ? 

Nit: trailing whitespace after the ?

@@ +143,5 @@
> +  if (!postDataText) {
> +    return false;
> +  }
> +
> +  if (postDataText.contains("Content-Type: application/x-www-form-urlencoded")) {

AFAIK header names are case insensitive, so "content-type" could work just as well.
Are we sure we shouldn't toLowerCase this here? (just checking)

@@ +167,5 @@
> +  if (!postDataText) {
> +    return false;
> +  }
> +
> +  if (postDataText.contains("Content-Type: multipart/form-data")) {

Ditto.

@@ +185,5 @@
> + */
> +function writePostDataTextParams(aPostDataText) {
> +  let lines = aPostDataText.split("\r\n");
> +
> +  return lines[lines.length - 1];

Uber nit: remove the newline before `return`.

@@ +203,5 @@
> +  if (!aHeaders) {
> +    return null;
> +  }
> +  let name = aName.toLowerCase();
> +  for (let i = 0; i < aHeaders.length; i++) {

Nit: you can use for..of here since we don't need the incrementor.

for (let header of aHeaders) {
  if (name == header.name.toLowerCase()) {
    return header.value;
  }
}

@@ +227,5 @@
> +  let result = "";
> +  let boundaryRe = /^--.+/gm;
> +  let boundaryString = boundaryRe.exec(aMultipartText)[0];
> +  let parts = aMultipartText.split(boundaryRe);
> +  parts.forEach((part) => {

There's no need for creating a function here, just use a for..of loop.

for (let part of parts) {
  ...
}

@@ +230,5 @@
> +  let parts = aMultipartText.split(boundaryRe);
> +  parts.forEach((part) => {
> +    // The second line in a field part should be the content disposition form-data.
> +    let contentDispositionLine = part.split("\r\n")[1];
> +    if (contentDispositionLine.contains("Content-Disposition: form-data")) {

Again, I'm pretty sure header names are case-insensitive.

@@ +266,5 @@
> +  let index = aMultipartText.indexOf("\r\n\r\n");
> +  if (index == -1) {
> +    return headers;
> +  }
> +  let headersText = aMultipartText.substring(0, index);

Nit: group this assignment and the ones below under the "// Parse the header lines" comment, since this variable is only used for that.

@@ +270,5 @@
> +  let headersText = aMultipartText.substring(0, index);
> +
> +  // Parse the header lines.
> +  let headerLines = headersText.split("\r\n");
> +  let lastHeaderName = null;

Nit: please add a newline before the for loop to make this slightly easier to read :)
Attachment #8382127 - Flags: review?(vporof) → feedback+
Attached patch bug859059-rev7.patch (obsolete) — Splinter Review
Thank you for the quick reply Victor!
This patch should address the comments.

Regarding matching headers in the code,
According to the RFC, field names are case-insensitive; I guess the same goes for field values too.
I am reverting back to using the String.toLowerCase / String.contains combo.
It's up for discussion whether we should use a regex instead, or a reusable contains-like function.
My guess is that a regex is faster here.
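
For illustration, either approach could look roughly like this (a sketch, assuming postDataText holds the request payload; not the patch code):

// toLowerCase + contains:
if (postDataText.toLowerCase()
                .contains("content-type: application/x-www-form-urlencoded")) {
  // ... treat as url encoded post data ...
}

// or a regex with the /i flag:
if (/content-type:\s*application\/x-www-form-urlencoded/i.test(postDataText)) {
  // ... treat as url encoded post data ...
}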

> Let me know if there's any help you need with that, I'll be glad to assist! Can't wait to get this landed :D
Yes, I agree! :)

t.
Attachment #8382324 - Flags: review?(vporof)
Attachment #8382324 - Flags: review?(mihai.sucan)
Attachment #8382324 - Flags: review?(vporof) → feedback+
Attachment #8382127 - Attachment is obsolete: true
Attachment #8382127 - Flags: review?(mihai.sucan)
Regarding tests, 
Maybe we should test it the same way as "Copy URL" and "Copy Image as Data URI" to make sure the menu item works.

It would be nice to unit test some of the functions in the .jsm
I am not sure if there is some practice regarding this as all functions are private except for the generateCommand method. Is it common to export these for testing purposes only? 

Or, do you have something else in mind?
Flags: needinfo?(vporof)
Yes, definitely, in principle the test should behave and look more or less like the other context menu entries tests. 

Generally, private functions don't need to be tested, but I don't feel strongly on either approach in this particular case. You could create a CurlUtils object on which all the helper methods live, export that along with the Curl object, and test each function. This would certainly be very helpful for getting good coverage on Curl.jsm, but I won't block landing if this proves to be too tedious.
Flags: needinfo?(vporof)
Attached patch 859059-test-menu-item.patch (obsolete) — Splinter Review
Ok, here is a browser test for testing the menu item.

As you see the test is modifying a couple of the more dynamic request headers in order to work on different setups. I am not that familiar with testing procedures at Mozilla so please guide me here :)

Thomas
Attachment #8382981 - Flags: review?(vporof)
And yes, I would like to refactor the helper functions to a CurlUtil object as suggested and try to write some unit tests for it.

I also have a note about reminding Mihai about potential redundancies etc. in the Curl code (ref. comment #35).
Waiting for his comments on patch #7.
(In reply to Thomas Andersen from comment #59)
> Created attachment 8382127 [details] [diff] [review]
> bug859059-rev6.patch
> 
> @Mihai:
> > The boundary string should be in content-type, as a parameter ;boundary=foobar. Maybe I'm mistaken, please double check. If possible, then it should be safer to get the boundary from the header value.
> 
> As far as I can see the content-type does not exist in the request headers
> array for multipart requests. It is only found in the post text.
> We could of-course refactor this a bit etc., but I left it as it is for now.
> Let me know and I can fix it.

I believe this is happening because we get the parsed headers, not the raw headers from the server. We do have access to them, at lower levels, we just haven't exposed them through the remote debugging protocol (yet).

Here you can't do much, aside from hacking into the NetworkMonitor code and the network event actor. This kind of work is beyond the purpose of this bug. See bug 859133.
Comment on attachment 8382324 [details] [diff] [review]
bug859059-rev7.patch

Review of attachment 8382324 [details] [diff] [review]:
-----------------------------------------------------------------

Patch looks good and I'm enjoying the improvements. Thanks for the updates! I have no additional comments beyond what Victor has already covered.

I'm slightly unhappy about the multipart binary data removal code, which cannot use the boundary string from the content-type header, but that is something we need to fix after bug 859133. Thomas, can you please open a follow-up bug report about this issue? Make it depend on the aforementioned bug, and this one here. Add a TODO comment in the function and mention the new bug number. Thanks!
Attachment #8382324 - Flags: review?(mihai.sucan) → feedback+
Comment on attachment 8382981 [details] [diff] [review]
859059-test-menu-item.patch

Review of attachment 8382981 [details] [diff] [review]:
-----------------------------------------------------------------

::: browser/devtools/netmonitor/test/browser_net_copy_as_curl.js
@@ +8,5 @@
> +function test() {
> +  initNetMonitor(CUSTOM_GET_URL).then(([aTab, aDebuggee, aMonitor]) => {
> +    info("Starting test... ");
> +
> +    function setFixedRequestHeaders(requestHeaders) {

You should just create a new sjs server file for this test and add some custom headers there directly, instead of mucking about with the ones already fetched.

You could read more about it here: https://developer.mozilla.org/en-US/docs/Httpd.js/HTTP_server_for_unit_tests#SJS:_server-side_scripts
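
For reference, an SJS file is just a script with a handleRequest(request, response) entry point, roughly like this (a minimal sketch; the file name is hypothetical):

// sjs_copy-as-curl.sjs
function handleRequest(request, response) {
  response.setStatusLine(request.httpVersion, 200, "OK");
  response.setHeader("Content-Type", "text/plain; charset=utf-8", false);
  response.write("Hello from the test server");
}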
Attachment #8382981 - Flags: review?(vporof) → feedback+
> You should just create a new sjs server file for this test and add some custom headers there directly, instead of mucking about with the ones already fetched.

I have looked at it, but I still cannot see an official way to set request headers in SJS files.
Also XHR can not modify the user-agent header.
Am I missing something here?
Do we really need to modify the user-agent header? If not, how about creating an html file that adds custom headers to the XHR?
> Do we really need to modify the user-agent header?

This is the UA in the expected result string.
If the expected result string has a fixed user-agent string (e.g. "Macintosh...."), I guess the comparison will fail when the client is using Windows/Linux.

https://dvcs.w3.org/hg/xhr/raw-file/tip/Overview.html#dom-xmlhttprequest-setrequestheader
https://bugzilla.mozilla.org/show_bug.cgi?id=627942

Maybe it is possible to get it from the client's .navigator object, though?
(In reply to Thomas Andersen from comment #71)
> > Do we really need to modify the user-agent header?
> 
> maybe it is possible to get it from the client's .navigator object though?

Yeah! navigator.userAgent should work in tests.
Attached patch 859059-test-menu-item.patch (obsolete) — Splinter Review
ok, new try for the menu item test.
Attachment #8382981 - Attachment is obsolete: true
Attachment #8383703 - Flags: review?(vporof)
Blocks: 978144
> Thomas, can you please open a follow-up bug report about this issue?
Done. bug 978144
Flags: needinfo?(mihai.sucan)
Attached patch bug859059-rev8.patch (obsolete) — Splinter Review
Refactored the helpers to a CurlUtils object.
Attachment #8382324 - Attachment is obsolete: true
Attachment #8383839 - Flags: review?(vporof)
Comment on attachment 8383839 [details] [diff] [review]
bug859059-rev8.patch

Review of attachment 8383839 [details] [diff] [review]:
-----------------------------------------------------------------

::: browser/devtools/shared/Curl.jsm
@@ +353,5 @@
> +                     .replace(/%/g, "\"%\"")
> +                     .replace(/\\/g, "\\\\")
> +                     .replace(/[\r\n]+/g, "\"^$&\"") + "\"";
> +  }
> +

Nit: redundant newline.
Attachment #8383839 - Flags: review?(vporof) → review+
Comment on attachment 8383703 [details] [diff] [review]
859059-test-menu-item.patch

Review of attachment 8383703 [details] [diff] [review]:
-----------------------------------------------------------------

This is beautiful! A few small comments, and this should be ready to land.

Are you still working on adding tests for the CurlUtils?

::: browser/devtools/netmonitor/test/browser_net_copy_as_curl.js
@@ +8,5 @@
> +function test() {
> +  initNetMonitor(CURL_URL).then(([aTab, aDebuggee, aMonitor]) => {
> +    info("Starting test... ");
> +
> +    const EXPECTED_RESULT = "curl '" + SIMPLE_SJS + "' -H 'Host: example.com' -H 'User-Agent: " + aDebuggee.navigator.userAgent + "' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' -H 'Accept-Language: " + aDebuggee.navigator.language + "' -H 'Accept-Encoding: gzip, deflate' -H 'Referer: " + CURL_URL + "' -H 'Connection: keep-alive'";

Would it be possible to format this in a way that's easier to read? Something like this:

const EXPECTED_RESULT = [
  "curl",
  "'" + SIMPLE_SJS + "'",
  "-H 'Host: example.com'",
  "-H 'User-Agent: " + aDebuggee.navigator.userAgent + "'",
  ...
].join(" ");

That's better, eh?

::: browser/devtools/netmonitor/test/html_copy-as-curl.html
@@ +14,5 @@
> +    <script type="text/javascript">
> +      function performRequest(aUrl) {
> +        var xhr = new XMLHttpRequest();
> +        xhr.open("GET", aUrl, true);
> +        xhr.setRequestHeader("Accept-Language", window.navigator.language);

You could even go full berserk and add a setRequestHeader("XXX-Custom-Header", "Hello");, to make sure custom headers are properly added.
Attachment #8383703 - Flags: review?(vporof) → review+
Attachment #786198 - Attachment is obsolete: true
Attached patch bug859059-rev9.patch (obsolete) — Splinter Review
Fixed redundant newline.
Attachment #8383839 - Attachment is obsolete: true
Attachment #8384786 - Flags: review?(vporof)
Attachment #8384786 - Flags: review?(vporof) → review+
Attached patch 859059-test-menu-item.patch (obsolete) — Splinter Review
Hi again,

Here is the new menu item test.

* Improved the readability of the expected result.
* Added some custom request headers to the web page's Ajax request.

I guess the escaping functions in CurlUtils could be covered a bit with the custom headers, but since the escaping functions are OS-dependent I am not sure if it is a good idea here. I mean, these tests run on a lot of different setups throughout the day, right?

Also, these tests would probably fail for agents who have the DNT (do-not-track) header turned "on".
DNT creates a DNT: 1 request header.

Maybe it is just too convoluted to browser test the generated string? :)

Thomas
Attachment #8383703 - Attachment is obsolete: true
Attachment #8384875 - Flags: review?(vporof)
> Are you still working on adding tests for the CurlUtils?
Yes, meaning I have not started to do any actual coding yet ;)
Comment on attachment 8384875 [details] [diff] [review]
859059-test-menu-item.patch

Review of attachment 8384875 [details] [diff] [review]:
-----------------------------------------------------------------

Eggcellent! :) (I'm eating an egg sandwich right now, pardon my lack of focus).

Once testing the relevant CurlUtil methods is finished, let's land this!
Such excite. Much useful! Very developers. Wow.
Attachment #8384875 - Flags: review?(vporof) → review+
(In reply to Thomas Andersen from comment #74)
> > Thomas, can you please open a follow-up bug report about this issue?
> Done. bug 978144

Thanks.
Flags: needinfo?(mihai.sucan)
(In reply to Victor Porof [:vporof][:vp] from comment #81)

> Once testing the relevant CurlUtil methods is finished, let's land this!

Quick question: is it better to use a mochitest-browser test for this?
I mean, since we can easily create web/Ajax requests and fetch the data from the Network panel for the tests. Think multipart form-data request payload strings.

thomas
Flags: needinfo?(vporof)
(In reply to Thomas Andersen from comment #83)
> (In reply to Victor Porof [:vporof][:vp] from comment #81)
> 
> > Once testing the relevant CurlUtil methods is finished, let's land this!
> 
> Quick question, is it better to use mochitest-browser test for this? 

That sounds good!
Flags: needinfo?(vporof)
Attached patch bug859059-rev10.patch (obsolete) — Splinter Review
Hi again,

Here is a new patch that should fix some issues that have been discovered.
Hopefully I have understood Mihai's explanations about this, and I hope he is not too mad about the mess ;)

* Refactored CurlUtils.removeBinaryDataFromMultipartText().
  Moved the cruft to find the boundary string to a CurlUtils.getMultipartBoundary() function.

* Added CurlUtils.getMultipartBoundary() function.
  As discussed with Mihai, the removeBinaryDataFromMultipartText() function should get the boundary string as a function argument.
  Since the boundary string is only available in the request payload at the moment (see bug 978144), this method implements a fallback to find it there; a rough sketch of this fallback follows after this list.
  See also bug 978144 and bug 859133.

* Fixed bug in CurlUtils.getHeadersFromMultipartText
  Made sure the function handles request payloads that do not have the header section (somehow multipart requests made with XHR and FormData do not add this).
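
A rough sketch of that fallback (illustrative only; the helper name and the exact regexes are assumptions, not the patch code):

// Prefer the boundary parameter from the Content-Type header and fall back
// to the first delimiter line in the payload. Note that the payload
// delimiter carries two extra leading dashes compared to the header value.
function getBoundarySketch(aData) {
  let contentType = CurlUtils.findHeader(aData.headers, "content-type");
  let fromHeader = contentType && /\bboundary=(\S+)/i.exec(contentType);
  if (fromHeader) {
    return fromHeader[1];
  }
  let fromPayload = /^--(\S+)/m.exec(aData.postDataText || "");
  return fromPayload ? fromPayload[1] : null;
}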

Also, a CurlUtil test is being worked on, but as you know the issues above needed to be resolved first.

Thomas
Attachment #8384786 - Attachment is obsolete: true
Attachment #8389428 - Flags: review?(vporof)
Attachment #8389428 - Flags: review?(mihai.sucan)
Attached patch bug859059-rev11.patch (obsolete) — Splinter Review
Well, writing tests is always a good exercise.

This patch builds on the previous patch (#10) and fixes it so that the boundary string for the parts recreated in the binary removal function gets prepended with two dashes.

E.g., the boundary the agent generates (4 leading dashes):

"----123"

Delimiter in the request payload (6 leading dashes):

"------123"

and it should end with (4 leading dashes):

"----123--"

This is according to the RFC.
Attachment #8389428 - Attachment is obsolete: true
Attachment #8389428 - Flags: review?(vporof)
Attachment #8389428 - Flags: review?(mihai.sucan)
Attachment #8390173 - Flags: review?(vporof)
Attachment #8390173 - Flags: review?(mihai.sucan)
Comment on attachment 8390173 [details] [diff] [review]
bug859059-rev11.patch

Review of attachment 8390173 [details] [diff] [review]:
-----------------------------------------------------------------

LGTM. Thank you!

f+ for now, will give r+ once the patch has tests.

::: browser/devtools/shared/Curl.jsm
@@ +134,5 @@
> +  }
> +};
> +
> +/**
> + * Misc. utility functions for the Curl command generator.

nit: s/Misc. utility/Utility/

@@ +233,5 @@
> +   *        The data source. See the description in the Curl object.
> +   * @return string
> +   *         The boundary string for the request.
> +   */
> +  getMultipartBoundary: function(aData) {

This is good now. Thanks!

@@ +234,5 @@
> +   * @return string
> +   *         The boundary string for the request.
> +   */
> +  getMultipartBoundary: function(aData) {
> +    let boundaryRe = /boundary=(-{3,}\w+)/i;

Maybe add \b before boundary.
Attachment #8390173 - Flags: review?(mihai.sucan) → feedback+
Comment on attachment 8390173 [details] [diff] [review]
bug859059-rev11.patch

Review of attachment 8390173 [details] [diff] [review]:
-----------------------------------------------------------------

r+ with some testing of the CurlUtils methods.
Attachment #8390173 - Flags: review?(vporof) → review+
Attached patch bug859059-rev12.patch (obsolete) — Splinter Review
Addressing issues in comment #87
Thanks!
Attachment #8390173 - Attachment is obsolete: true
Attachment #8390804 - Flags: review?(mihai.sucan)
Hi again,

Here is a preview patch for the CurlUtils tests just to give you an idea of the direction. 

Hoping for some feedback on the direction.

As a preview it does not have 100% coverage and I hope to clean it up etc.
I guess there need to be some OS conditionals for testing the string escaping functions in order to make the test work on non-POSIX systems. I guess I can use the Services object for this if there is not a more desired practice for it.
See test for removeBinaryDataFromMultipartText()
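
Something along these lines should work for the platform check (a sketch; the constant names are placeholders, not real test code):

// Pick the expected escaping based on the platform the test runs on.
let isWin = Services.appinfo.OS == "WINNT";
let expectedParams = isWin
  ? EXPECTED_POSTDATA_WIN     // hypothetical constant with Windows escaping
  : EXPECTED_POSTDATA_POSIX;  // hypothetical constant with POSIX escaping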

If the direction etc. is OK, I guess this test will be ready soon
(I am leaving the screen for the weekend and expect to be back next week).

Thomas
Attachment #8390838 - Flags: review?(vporof)
Attachment #8390838 - Flags: review?(mihai.sucan)
Comment on attachment 8390804 [details] [diff] [review]
bug859059-rev12.patch

Review of attachment 8390804 [details] [diff] [review]:
-----------------------------------------------------------------

Thanks for the update. No need to ask for re-review for such small changes.

::: browser/devtools/shared/Curl.jsm
@@ +134,5 @@
> +  }
> +};
> +
> +/**
> + * Misc. Utility functions for the Curl command generator.

Here I meant search 'misc. utility' and replace with 'Utility'. (no need for 'misc')
Attachment #8390804 - Flags: review?(mihai.sucan) → feedback+
(this bug was assigned to the wrong person)
Assignee: avinash → thomas
Status: NEW → ASSIGNED
Comment on attachment 8390838 [details] [diff] [review]
859059-curl-utils-test-preview.patch

Thanks for your patch.

No need to ask for two people to review patches, unless the patch includes changes each person needs to see (eg. if you change two different components). Victor is the best choice for reviewing netmonitor tests.
Attachment #8390838 - Flags: review?(mihai.sucan)
Comment on attachment 8390838 [details] [diff] [review]
859059-curl-utils-test-preview.patch

Review of attachment 8390838 [details] [diff] [review]:
-----------------------------------------------------------------

Lookin' really good! Kudos!

::: browser/devtools/netmonitor/test/browser_net_curl-utils.js
@@ +13,5 @@
> +    let { RequestsMenu } = NetMonitorView;
> +
> +    RequestsMenu.lazyUpdate = false;
> +
> +    aDebuggee.performRequests(SIMPLE_SJS);

This should be called after waitForNetworkEvents.
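
I.e., register the listener first and only then trigger the requests, roughly like this (a sketch; the event counts depend on what the test page actually requests):

let wait = waitForNetworkEvents(aMonitor, 3);
aDebuggee.performRequests(SIMPLE_SJS);
wait.then(() => {
  // Inspect the RequestsMenu items here.
});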

@@ +23,5 @@
> +        multipart: RequestsMenu.getItemAtIndex(2),
> +      };
> +
> +      Task.spawn(function*() {
> +        createCurlData(requests.get.attachment, gNetwork).then((aData) => {

You need to yield all functions that return a promise, to wait for that promise to finish before continuing execution. So
> yield createCurlData...

@@ +27,5 @@
> +        createCurlData(requests.get.attachment, gNetwork).then((aData) => {
> +          test_findHeader(aData);
> +        });
> +
> +        createCurlData(requests.post.attachment, gNetwork).then((aData) => {

Yield.

@@ +32,5 @@
> +          test_isUrlEncodedRequest(aData);
> +          test_writePostDataTextParams(aData);
> +        });
> +
> +        createCurlData(requests.multipart.attachment, gNetwork).then((aData) => {

Yield!

@@ +62,5 @@
> +  let host = CurlUtils.findHeader(headers, "Host");
> +  let xRequestedWithLowerCased = CurlUtils.findHeader(headers, "x-requested-with");
> +  let doesNotExist = CurlUtils.findHeader(headers, "X-Does-Not-Exist");
> +
> +  is(host, "example.com", 

Nit: please remove all trailing whitespace from this file.

@@ +115,5 @@
> +}
> +
> +function createCurlData(aSelected, gNetwork) {
> +  let deferred = promise.defer();
> +  Task.spawn(function*() {

Task.spawn already returns a promise, so no need to create another deferred. Just
> return Task.spawn(function*() { ...

@@ +137,5 @@
> +      let postData = aSelected.requestPostData.postData.text;
> +      data.postDataText = yield gNetwork.getString(postData);
> +    }
> +
> +    deferred.resolve(data);

Just "return data" to resolve the task promise with this value.
Attachment #8390838 - Flags: review?(vporof) → feedback+
Attached patch bug859059-rev13.patch (obsolete) — Splinter Review
Addressing typo issue in comment #91

> No need to ask for two people to review patches, unless the patch includes changes each person needs to see (eg. if you change two different components).

I just wanted to be sure I was on track regarding the request headers (bug 978144).

Thomas
Attachment #8390804 - Attachment is obsolete: true
Attachment #8393146 - Flags: review?(mihai.sucan)
Attached patch 859059-curl-utils-test.patch (obsolete) — Splinter Review
Awright! Here is a more mature util test, addressing the issues from comment #94. I do not have a Windows system available at the moment, so the Windows-related tests were fiddled with and then checked on OS X.

See: 
test_removeBinaryDataFromMultipartText()
test_escapeStringWin()

Thomas
Attachment #8390838 - Attachment is obsolete: true
Attachment #8393151 - Flags: review?(vporof)
Comment on attachment 8393146 [details] [diff] [review]
bug859059-rev13.patch

Review of attachment 8393146 [details] [diff] [review]:
-----------------------------------------------------------------

Header parsing and the way you work with multipart requests seem fine. Thanks!

::: browser/devtools/shared/Curl.jsm
@@ +325,5 @@
> +        continue;
> +      }
> +
> +      let indexOfColon = line.indexOf(":");
> +      let header = [line.slice(0, indexOfColon), line.slice(indexOfColon + 1)];

If indexOfColon == -1, just continue to the next line; you can consider it an invalid line.
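
I.e., roughly:

for (let line of headerLines) {
  let indexOfColon = line.indexOf(":");
  if (indexOfColon == -1) {
    // Not a "Name: value" line; treat it as invalid and skip it.
    continue;
  }
  let header = [line.slice(0, indexOfColon), line.slice(indexOfColon + 1)];
  // ... push the parsed header ...
}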
Attachment #8393146 - Flags: review?(mihai.sucan) → review+
Comment on attachment 8393151 [details] [diff] [review]
859059-curl-utils-test.patch

Review of attachment 8393151 [details] [diff] [review]:
-----------------------------------------------------------------

Looks good!

Can you please fold all the patches into a single one? I'll then push to the try servers, and if everything is green we're going to land it!
Attachment #8393151 - Flags: review?(vporof) → review+
Attached patch 859059-folded.patch (obsolete) — Splinter Review
Thanks!
Here are the patches, folded. The if-test from comment #97 is included.
This repo is also updated:

changeset:   174323:742b62803af3
tag:         qparent

All netmonitor tests pass on my setup. Fingers crossed for the Windows/curl-related stuff ;)

Thomas
Attachment #8393773 - Flags: review?(vporof)
Comment on attachment 8384875 [details] [diff] [review]
859059-test-menu-item.patch

Please mark obsolete patches as obsolete in the future ;)
Attachment #8384875 - Attachment is obsolete: true
Attachment #8393146 - Attachment is obsolete: true
Attachment #8393151 - Attachment is obsolete: true
Attachment #8393773 - Flags: review?(vporof) → review+
A bit of orange there for browser_net_copy_as_curl.js :(
I'll fix it. I forgot to create the expected result for the Windows-escaped string.
On the bright side, the CurlUtils tests seem to pass ;)
Hopefully this patch will pass on Windows.

Modified browser/devtools/netmonitor/test/browser_net_copy_as_curl.js to use the expected Windows result.
Attachment #8393773 - Attachment is obsolete: true
Attachment #8393886 - Flags: review?(vporof)
Comment on attachment 8393886 [details] [diff] [review]
859059-folded.patch

Review of attachment 8393886 [details] [diff] [review]:
-----------------------------------------------------------------

Add a checkin-needed keyword if try run is green ;)

https://tbpl.mozilla.org/?tree=Try&rev=be18913fc836
Attachment #8393886 - Flags: review?(vporof) → review+
There are some orange and one red. I think/hope I have dealt with the failures. 
Right now I am not sure what action to take next.

https://developer.mozilla.org/en-US/docs/Developer_Guide/Committing_Rules_and_Responsibilities#Dealing_with_test_failures
Flags: needinfo?(vporof)
Those failures are unrelated. Let's land this!
Flags: needinfo?(vporof)
Keywords: checkin-needed
https://hg.mozilla.org/integration/fx-team/rev/70ed33107301

In the future, please make sure your patches have a good commit message on them when requesting checkin.
https://developer.mozilla.org/en-US/docs/Developer_Guide/Committing_Rules_and_Responsibilities#Checkin_comment
Flags: in-testsuite+
Keywords: checkin-needed
Whiteboard: [fixed-in-fx-team]
Something in this checkin-needed push caused Win7 talos xperf regressions. It wasn't clear to me which was at fault, so I backed out the entire push. Please verify that your patch wasn't at fault and re-request checkin when ready :)
https://hg.mozilla.org/integration/fx-team/rev/6d0a341040a9

https://tbpl.mozilla.org/php/getParsedLog.php?id=36500476&tree=Fx-Team
Whiteboard: [fixed-in-fx-team]
I don't think this patch can cause such regressions.
Keywords: checkin-needed
(In reply to Victor Porof [:vporof][:vp] from comment #110)
> I don't think this patch can cause such regressions.

Try run or no checkin.
Keywords: checkin-needed
|try: -b o -p win32 -u none[Windows 7] -t xperf| would have worked, but OK
xperf is green on the Try run, looks like bug 812712 is our lucky winner. Re-landed :)
https://hg.mozilla.org/integration/fx-team/rev/8629b33918d4
Whiteboard: [fixed-in-fx-team]
https://hg.mozilla.org/mozilla-central/rev/8629b33918d4
Status: ASSIGNED → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED
Whiteboard: [fixed-in-fx-team]
Target Milestone: --- → Firefox 31
Hi. Looking at the discussion here, I'm not sure whether converting request payload to a --data-binary addition is supposed to be implemented (and if my problem is a bug or not). Is it?

Here I have a POST request with a payload. I see it in DevTools / Network / Params, but "Copy as cURL" emits a command without any --data-binary parameter. The Chrome DevTools equivalent does emit a --data-binary cURL parameter.
Product: Firefox → DevTools