Hello, I can't connect to my MongoDB database.
I'm connecting to the MongoDB instance inside CentOS, and the IP address in the connection string is the virtual machine's IP. I installed MongoDB in the VM and used the course code as downloaded, changing only the IP address, but it always fails with a timeout error:
2020-03-04 16:13:18 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-03-04 16:13:18 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2020-03-04 16:13:19 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://xiaoguotu.to8to.com/tuce/p_1.html> (referer: None)
{b'User-Agent': [b'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6'], b'Accept': [b'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'], b'Accept-Language': [b'en'], b'Accept-Encoding': [b'gzip,deflate']}
2020-03-04 16:13:19 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://xiaoguotu.to8to.com/case/list?a2=0&a12=&a11=117529&a1=0&a17=1> (referer: https://xiaoguotu.to8to.com/tuce/p_1.html)
2020-03-04 16:13:19 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://xiaoguotu.to8to.com/case/list?a2=0&a12=&a11=3095912&a1=0&a17=1> (referer: https://xiaoguotu.to8to.com/tuce/p_1.html)
2020-03-04 16:13:50 [scrapy.core.scraper] ERROR: Error processing {'content_id': '117529',
'content_name': '质朴31平混搭小户型客厅实景图片',
'content_url': 'https://xiaoguotu.to8to.com/case/list?a2=0&a12=&a11=117529&a1=0&a17=1',
'image_urls': ['https://pic1.to8to.com/case/social/20190916/d1ef8bdcf5ab069955bd4aa3b58b17d0.jpg'],
'nick_name': '无解方程',
'pic_name': ''}
Traceback (most recent call last):
  File "F:\python\lib\site-packages\twisted\internet\defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "C:\lujing_scrapy_project\tubatu_scrapy_project\tubatu_scrapy_project\pipelines.py", line 22, in process_item
    self.mycollection.insert_one(data)
  File "F:\python\lib\site-packages\pymongo\collection.py", line 698, in insert_one
    session=session),
  File "F:\python\lib\site-packages\pymongo\collection.py", line 612, in _insert
    bypass_doc_val, session)
  File "F:\python\lib\site-packages\pymongo\collection.py", line 600, in _insert_one
    acknowledged, _insert_command, session)
  File "F:\python\lib\site-packages\pymongo\mongo_client.py", line 1490, in _retryable_write
    with self._tmp_session(session) as s:
  File "F:\python\lib\contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "F:\python\lib\site-packages\pymongo\mongo_client.py", line 1823, in _tmp_session
    s = self._ensure_session(session)
  File "F:\python\lib\site-packages\pymongo\mongo_client.py", line 1810, in _ensure_session
    return self.__start_session(True, causal_consistency=False)
  File "F:\python\lib\site-packages\pymongo\mongo_client.py", line 1763, in __start_session
    server_session = self._get_server_session()
  File "F:\python\lib\site-packages\pymongo\mongo_client.py", line 1796, in _get_server_session
    return self._topology.get_server_session()
  File "F:\python\lib\site-packages\pymongo\topology.py", line 485, in get_server_session
    None)
  File "F:\python\lib\site-packages\pymongo\topology.py", line 209, in _select_servers_loop
    self._error_message(selector))
pymongo.errors.ServerSelectionTimeoutError: 192.168.1.9:27017: timed out
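A `ServerSelectionTimeoutError` against a VM address usually means one of two things on the server side: mongod is only listening on 127.0.0.1 (the rpm default), or the VM's firewall is blocking port 27017. Assuming a default package install on CentOS 7 with the config at `/etc/mongod.conf` and firewalld running (paths and service names may differ on your setup), a fix sketch looks like:

```shell
# In /etc/mongod.conf the relevant section is:
#   net:
#     port: 27017
#     bindIp: 127.0.0.1    # only accepts local connections by default

# Bind mongod to all interfaces so the Windows host can reach it,
# then restart the service to pick up the change
sudo sed -i 's/bindIp: 127.0.0.1/bindIp: 0.0.0.0/' /etc/mongod.conf
sudo systemctl restart mongod

# Open port 27017 in firewalld (the CentOS 7 default firewall)
sudo firewall-cmd --permanent --add-port=27017/tcp
sudo firewall-cmd --reload
```

Note that `bindIp: 0.0.0.0` exposes mongod on every interface; that is fine on a private host-only VM network, but enable authentication if the machine is reachable from anywhere else.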
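Before re-running the spider, it helps to check whether the VM's port is reachable at all from the Windows side, which separates a network/firewall problem from a pymongo problem. This `can_reach` helper is hypothetical (not part of the course code) and uses plain sockets, so it needs no pymongo at all:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused, timed out, and unreachable
        return False

# Host and port taken from the traceback above; adjust to your VM:
# can_reach("192.168.1.9", 27017)
```

If this returns False, fix the server/firewall first. If it returns True but pymongo still times out, pass `serverSelectionTimeoutMS=3000` to `MongoClient` so the failure surfaces in 3 seconds instead of the 30-second default while you debug.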