
I was wondering if it is possible to extract the parameters of a JavaScript function with Scrapy, from a code similar to this one:

<script type="text/javascript">
    var map;
  function initialize() {
    var fenway = new google.maps.LatLng(43.2640611,2.9388228);
  }
</script>

I would like to extract the coordinates 43.2640611 and 2.9388228.

Asked Feb 25, 2014 by AritzBi; edited Dec 14, 2017 by alecxe.
  • What do you mean by 'extract'? You can define var lat=43.26, lng=2.93 and then pass them to the method. – Aamir Afridi Commented Feb 25, 2014 at 10:40
  • @AamirAfridi The OP means - extract using Scrapy, the Python crawler engine. – lexicore Commented Nov 4, 2014 at 6:59

2 Answers


This is where the re() method helps.

The idea is to locate the script tag via xpath() and use re() to extract the lat and lng from the script tag's contents. Demo from the scrapy shell:

$ scrapy shell index.html
>>> response.xpath('//script').re(r'new google\.maps\.LatLng\(([0-9.]+),([0-9.]+)\);')
[u'43.2640611', u'2.9388228']

where index.html contains:

<script type="text/javascript">
    var map;
  function initialize() {
    var fenway = new google.maps.LatLng(43.2640611,2.9388228);
  }
</script>

Of course, in your case the xpath would not be just //script.

FYI, the new google\.maps\.LatLng\(([0-9.]+),([0-9.]+)\); regular expression uses the capturing groups ([0-9.]+) to extract the coordinate values.
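The same regular expression also works with plain Python's re module, which is handy for sanity-checking the pattern outside the Scrapy shell (a minimal sketch, using the script contents from the question):

```python
import re

# Raw contents of the <script> tag from the question
script_text = '''
    var map;
  function initialize() {
    var fenway = new google.maps.LatLng(43.2640611,2.9388228);
  }
'''

# The same pattern passed to Selector.re() in the answer above
pattern = re.compile(r'new google\.maps\.LatLng\(([0-9.]+),([0-9.]+)\);')

match = pattern.search(script_text)
lat, lng = match.groups()
print(lat, lng)  # 43.2640611 2.9388228
```

Scrapy's re() applies exactly this kind of pattern to each selected node's text and returns all captured groups as a flat list.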

Also see Using selectors with regular expressions.

Disclaimer: I haven't tried this approach, but here's how I would think about it if I were constrained to using Scrapy and didn't want to parse JavaScript the way alecxe suggested above. This is a finicky, fragile hack :-)

You can try using scrapyjs to execute the JavaScript code from your scrapy crawler. In order to capture those parameters, you'd need to do the following:

  1. Load the original page and save it to disk.
  2. Modify the page to replace the google.maps.LatLng function with your own (see below). Make sure your script runs AFTER the Google Maps JS has loaded.
  3. Load the modified page using scrapyjs (or the instance of webkit created by it)
  4. Parse the page, look for the two special divs created by your fake LatLng function that contain the extracted lat and lng variables.

More on step 2: Make your fake LatLng function modify the HTML page to expose lat and lng variables so that you could parse them out with Scrapy. Here is some crude code to illustrate:

var LatLng = function LatLng(lat, lng) {
  // Inject the captured coordinates into the DOM so Scrapy can read them back
  var latDiv = document.createElement("div");
  latDiv.id = "extractedLat";
  latDiv.innerHTML = lat;
  document.body.appendChild(latDiv);

  var lngDiv = document.createElement("div");
  lngDiv.id = "extractedLng";
  lngDiv.innerHTML = lng;
  document.body.appendChild(lngDiv);
}

// Shadow the real API so initialize() calls the stub instead
google = {
  maps: {
    LatLng: LatLng
  }
};
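Back on the Scrapy side, step 4 boils down to pulling the text out of those two marker divs. A minimal sketch with plain re (the div ids match the JavaScript stub above; the rendered HTML here is a hypothetical example of what scrapyjs would hand back):

```python
import re

# Hypothetical HTML after the fake LatLng function has run in the page
rendered = '''
<body>
  <div id="extractedLat">43.2640611</div>
  <div id="extractedLng">2.9388228</div>
</body>
'''

def extract_coord(html, div_id):
    # Pull the text content of the marker div injected by the stub
    m = re.search(r'<div id="%s">([0-9.]+)</div>' % div_id, html)
    return m.group(1) if m else None

lat = extract_coord(rendered, "extractedLat")
lng = extract_coord(rendered, "extractedLng")
print(lat, lng)  # 43.2640611 2.9388228
```

In a real spider you would use response.xpath('//div[@id="extractedLat"]/text()') instead of a raw regex, but the idea is the same.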

Overall, this approach sounds a bit painful, but could be fun to try.
