
scrapy-selenium is yielding normal scrapy.Request instead of SeleniumRequest #78

Open
@iamumairayub

Description


@clemfromspace I just decided to use your package in my Scrapy project, but it is yielding a normal scrapy.Request instead of a SeleniumRequest:

from shutil import which
from scrapy_selenium import SeleniumRequest
from scrapy.contracts import Contract
class WithSelenium(Contract):
    """ Contract to set the request class to be SeleniumRequest for the current call back method to test
    @with_selenium
    """
    name = 'with_selenium'
    request_cls = SeleniumRequest
    
class WebsiteSpider(BaseSpider):
    name = 'Website'

    custom_settings = {
        'DOWNLOADER_MIDDLEWARES': {
             'scrapy_selenium.SeleniumMiddleware': 800
        },
        'SELENIUM_DRIVER_NAME': 'firefox',
        'SELENIUM_DRIVER_EXECUTABLE_PATH': which('geckodriver'),
        'SELENIUM_DRIVER_ARGUMENTS': ['-headless']  
    }
    
    def start_requests(self):
        yield SeleniumRequest(url=url,
                              callback=self.parse_result)

    def parse_result(self, response):
        """
        @with_selenium
        """
        print(response.request.meta['driver'].title)  # --> gives KeyError

I have seen this issue, but it is not helpful at all.
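For reference, a minimal sanity check along these lines can confirm which request class actually reaches the callback and whether the middleware injected the Selenium driver. This is an illustrative sketch (a debugging replacement for parse_result, not part of the original report); it only relies on the spider's standard self.logger and the Selenium driver's title attribute:

    def parse_result(self, response):
        # Log the concrete request class seen by the callback.
        self.logger.info("request class: %s", type(response.request).__name__)
        # Check whether SeleniumMiddleware added the driver to the request meta.
        self.logger.info("'driver' in meta: %s", 'driver' in response.request.meta)
        driver = response.request.meta.get('driver')
        if driver is not None:
            self.logger.info("page title: %s", driver.title)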
