# Reddist

Just a simple Python library that makes caching Reddit posts easier.
## Caching options

- Memory cache
- Redis cache
- Pickle cache
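The pickle cache persists fetched posts to a local file so they survive restarts. A minimal sketch of the underlying idea, using only the standard library (the names here are illustrative, not reddist's API):

```python
import os
import pickle

CACHE_FILE = "cache.pickle"  # illustrative file name


def load_cache(path: str = CACHE_FILE) -> dict:
    """Load the cached posts mapping from disk, or start empty."""
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    return {}


def save_cache(cache: dict, path: str = CACHE_FILE) -> None:
    """Persist the cache so posts survive restarts."""
    with open(path, "wb") as f:
        pickle.dump(cache, f)


# Cache a subreddit's posts, then read them back from disk.
cache = load_cache()
cache["pics"] = [{"title": "a post"}]
save_cache(cache)
print(load_cache()["pics"])
```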
## Usage

Installation:

Development version:

```sh
poetry add git+https://github.com/CaffeineDuck/reddist
```

Stable version:

```sh
poetry add reddist
```
### Pickle usage

```py
import asyncio
import random
from dataclasses import asdict

# Import paths assumed: `Reddit` comes from asyncpraw, the cacher from reddist.
from asyncpraw import Reddit
from reddist import PickleRedditCacher


async def main():
    reddit_cacher = PickleRedditCacher(
        Reddit(
            user_agent="dpydit",
            client_id="CLIENT_ID",
            client_secret="CLIENT_SECRET",
        ),
        "cache.pickle",
        cached_posts_count=100,
    )
    # Start background caching, then fetch posts from the cache.
    reddit_cacher.start_caching()
    posts = await reddit_cacher.get_subreddit_posts("pics")
    print(asdict(random.choice(posts)))


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
```
### Memory usage

```py
import asyncio
import random
from dataclasses import asdict

# Import paths assumed: `Reddit` comes from asyncpraw, the cacher from reddist.
from asyncpraw import Reddit
from reddist import MemoryRedditCacher


async def main():
    reddit_cacher = MemoryRedditCacher(
        Reddit(
            user_agent="dpydit",
            client_id="CLIENT_ID",
            client_secret="CLIENT_SECRET",
        ),
        cached_posts_count=100,
    )
    # Start background caching, then fetch posts from the cache.
    reddit_cacher.start_caching()
    posts = await reddit_cacher.get_subreddit_posts("pics")
    print(asdict(random.choice(posts)))


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
```
### Redis usage

```py
import asyncio
import random
from dataclasses import asdict

import aioredis

# Import paths assumed: `Reddit` comes from asyncpraw, the cacher from reddist.
from asyncpraw import Reddit
from reddist import RedisRedditCacher


async def main():
    redis = aioredis.from_url("redis://localhost")
    async with redis.client() as conn:
        reddit_cacher = RedisRedditCacher(
            Reddit(
                user_agent="dpydit",
                client_id="CLIENT_ID",
                client_secret="CLIENT_SECRET",
            ),
            conn,
            cached_posts_count=100,
        )
        posts = await reddit_cacher.get_subreddit_posts("pics")
        print(asdict(random.choice(posts)))


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
```
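Each example above prints a post via `dataclasses.asdict`, which recursively converts a dataclass instance into a plain dictionary, handy for JSON serialization or storage. A small illustration with a hypothetical `Post` dataclass (not reddist's actual post model):

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class Post:
    # Hypothetical fields; reddist's real post model may differ.
    title: str
    score: int


post = Post(title="a picture", score=42)
data = asdict(post)      # {'title': 'a picture', 'score': 42}
print(json.dumps(data))  # ready for storage in Redis or on disk
```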
WIP (expect breaking changes).
## Project details

Download files:

- Source distribution: reddist-0.1.2.tar.gz (4.8 kB)
- Built distribution: reddist-0.1.2-py3-none-any.whl (6.3 kB)