# fengliu_backend

**Repository Path**: LHY1999/fengliu_backend

## Basic Information

- **Project Name**: fengliu_backend
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2018-12-16
- **Last Updated**: 2020-12-19

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# 2018 Backend: First Assessment

- ### Python part

1. Installed requests after switching pip to a mirror source

   ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8shx9h9sj31hc0u0gpq.jpg)

2. Wrote the code

```
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
"""
@Author  : {Richard Li}
@License : (C) Copyright 2013-2017, {BUPT}
@Contact : {543306408@qq.com}
@Software: PyCharm
@File    : fengliu_python_test.py
@Time    : 2018/12/16 16:34
@Desc    : Fetch the assignment page, extract its title, and store the results.
"""
import pymysql
import requests
from lxml import etree


def get_the_data():
    base_url = 'https://gitee.com/sicefengliu/2018_stage_operation_layout/blob/master/backend/1.md'
    try:
        trial_req = requests.get(base_url)
        trial_req.raise_for_status()  # turn 4xx/5xx status codes into exceptions
    except requests.exceptions.RequestException:
        print("the request failed")
        return None, None
    trial_data = trial_req.text
    selector = etree.HTML(trial_data)
    # xpath() returns a list of matches; take the first one, if any
    titles = selector.xpath('/html/head/title/text()')
    the_title = titles[0] if titles else None
    return trial_data, the_title


def write_into_txt(the_datas):
    # "with" closes the file even if the write raises
    with open("the_datas.txt", 'w', encoding='utf-8') as outfile:
        outfile.write(the_datas)


def write_into_mysql(title):
    # insert operation
    db = pymysql.connect(host="localhost", user="root", password="123456",
                         db="fengliu", port=3306)
    # use the cursor() method to get a cursor
    cur = db.cursor()
    # parameterized query: the driver escapes the values, avoiding SQL injection
    sql_insert = "insert into fengliu(id, title) values (%s, %s);"
    try:
        cur.execute(sql_insert, (3, title))
        db.commit()
    except Exception:
        db.rollback()
    finally:
        db.close()


if __name__ == '__main__':
    # fetch the data with the crawler
    print("start getting the data")
    the_datas, title = get_the_data()
    if the_datas is None:
        raise SystemExit("fetch failed, nothing to save")
    print(the_datas, title)
    print("success!!!")
    # store the data
    print("start saving the data")
    write_into_txt(the_datas)
    write_into_mysql(title)
    print("success!!!")
    print("end of the spider!")
```

a) Completed features:

- crawls the page's HTML;
- extracts element information with XPath;
- stores the scraped information in a database with pymysql;
- saves the information to a txt file;

b) Unfortunately, due to time constraints, the error handling is still fairly basic and the code could be cleaner;

c) Some screenshots of the finished work

- XAMPP and the MySQL database

  ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8sv2lrzwj31hc0u0jwe.jpg)
  ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8svozwcej31hc0u0tdz.jpg)

- Results after crawling

  ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8swr84oyj31hc0u00vd.jpg)
  ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8svkrlsqj31hc0u0q82.jpg)
  ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8swv5h76j31hc0u07wh.jpg)

- ### Linux part

1. Installed Ubuntu in a VM

   ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8t7npve5j31hc0u0tdf.jpg)

2. Downloaded Xshell and connected to the server

   ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8t70lhioj31hc0u00vl.jpg)

3. Wrote this article and set up the image host

   ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8t2g0aa6j31hc0u0apv.jpg)
   ![image](https://ws1.sinaimg.cn/large/006ZR1OQly1fy8t2uvovuj31hc0u0do4.jpg)

4. Submitted the homework

- ### Personal reflections

Through this assignment I reviewed Python and Linux and learned how to use Xshell. I also experienced the dread of an assignment due tomorrow that I hadn't started yet. I learned a lot; finally, good luck to everyone on your finals!

- ### Summary

All of the basic tasks are complete; for the advanced part, I finished studying SQL statements.
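One detail of the scraper worth noting: lxml's `selector.xpath('/html/head/title/text()')` returns a *list* of matching text nodes, not a single string. If lxml were unavailable, the same `<title>` extraction can be sketched with the stdlib `html.parser` module; the `TitleParser` class and the sample HTML below are illustrative stand-ins, not part of the original script:

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collect the text content of the first <title> element."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        # Only accumulate text while we are inside <title>...</title>
        if self._in_title:
            self.title += data


# Stand-in for the HTML fetched by requests in the scraper
html = "<html><head><title>backend/1.md</title></head><body></body></html>"
parser = TitleParser()
parser.feed(html)
print(parser.title)
```

Unlike the XPath call, this yields a plain string directly, so no `[0]` indexing is needed.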
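On the SQL side, the safest way to build the `insert into fengliu` statement is a parameterized query, where the driver escapes the values itself instead of the values being interpolated into the SQL string. A minimal sketch of the pattern using the stdlib `sqlite3` module as a stand-in for the MySQL setup (sqlite3 uses `?` placeholders where pymysql's `cursor.execute` uses `%s`; the table name and columns follow the script above):

```python
import sqlite3

# In-memory SQLite database stands in for the MySQL "fengliu" database
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fengliu (id INTEGER PRIMARY KEY, title TEXT)")

# Parameterized insert: the driver escapes the values, so a title
# containing quotes can neither break the statement nor inject SQL
title = '2018_stage_operation_layout: "backend" task'
cur.execute("INSERT INTO fengliu (id, title) VALUES (?, ?)", (3, title))
conn.commit()

# Read the row back to confirm the title survived intact
cur.execute("SELECT title FROM fengliu WHERE id = ?", (3,))
stored = cur.fetchone()[0]
conn.close()
print(stored)
```

With pymysql the only change is the placeholder style, e.g. `cur.execute("insert into fengliu(id, title) values (%s, %s)", (3, title))`.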