1

I'm building a news rating script for a site with a large number of users. I'm trying to make the site as efficient as possible, and right now I'm wondering what the most efficient way to track votes is. Naturally, I don't want users to be able to vote more than once.

My first thought was to store the votes in my MySQL database, but I'm worried this would hurt the site's speed, since that table would get very large.

Is storing them in the database still the best solution, or is there something better?


5 Answers

1

If you plan on having more than 1,000,000 records, you should make sure the table's structure is efficient (which shouldn't be hard for your example) and that you index it correctly.
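For illustration, here is a minimal sketch of what such a table could look like; the schema and column names (`user_id`, `article_id`, `vote`) are assumptions, and the composite primary key both serves as the lookup index and prevents duplicate votes:

```php
<?php
// Minimal sketch (schema and column names are assumptions): one row per (user, article) vote.
// The composite primary key doubles as the lookup index and blocks duplicate votes.
$pdo = new PDO('mysql:host=localhost;dbname=news', 'dbuser', 'dbpass');

$pdo->exec("
    CREATE TABLE IF NOT EXISTS votes (
        user_id    INT UNSIGNED NOT NULL,
        article_id INT UNSIGNED NOT NULL,
        vote       TINYINT      NOT NULL,   -- e.g. +1 / -1
        created_at TIMESTAMP    NOT NULL DEFAULT CURRENT_TIMESTAMP,
        PRIMARY KEY (user_id, article_id),
        KEY idx_article (article_id)        -- fast per-article tallies
    ) ENGINE=InnoDB
");

// INSERT IGNORE silently drops a second vote by the same user on the same article.
$userId    = 42;   // example values
$articleId = 7;
$stmt = $pdo->prepare('INSERT IGNORE INTO votes (user_id, article_id, vote) VALUES (?, ?, ?)');
$stmt->execute([$userId, $articleId, 1]);
$alreadyVoted = ($stmt->rowCount() === 0);   // 0 affected rows means the vote was a duplicate
```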

Memcached would be the simplest way to implement caching and is easy to scale if your site grows and more servers are necessary.
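As a rough sketch of that idea, using PHP's Memcached extension to cache a per-article vote total so most reads never hit MySQL (the key name and 60-second TTL are arbitrary choices for illustration):

```php
<?php
// Rough sketch: cache each article's vote total so repeated reads skip MySQL.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

function getArticleScore(PDO $pdo, Memcached $mc, int $articleId): int {
    $key   = "article_score_$articleId";   // assumed key naming scheme
    $score = $mc->get($key);

    if ($score === false) {   // cache miss (we never store false as a value)
        $stmt = $pdo->prepare('SELECT COALESCE(SUM(vote), 0) FROM votes WHERE article_id = ?');
        $stmt->execute([$articleId]);
        $score = (int) $stmt->fetchColumn();
        $mc->set($key, $score, 60);   // cache briefly; stale by at most a minute
    }
    return (int) $score;
}
```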

Answered 2009-12-22T17:01:00.030
1

With a properly indexed vote table, you can keep reasonable performance regardless of how large your table is (of course, beyond a certain point, your tables will be too large to fit in cache, but that would involve having a very large number of users and items).

Add in some per-user caching (on the client, in $_SESSION, or using memcached) and you can get quite a fast "no" response time.
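A sketch of that fast "no" idea, assuming a votes table keyed on hypothetical `user_id`/`article_id` columns and caching the already-voted flag in $_SESSION so repeat checks skip the database entirely:

```php
<?php
// Sketch: answer "has this user already voted on this article?" from the session
// when possible, falling back to an indexed primary-key lookup in MySQL.
session_start();

function hasVoted(PDO $pdo, int $userId, int $articleId): bool {
    $cacheKey = "voted_$articleId";   // assumed session key naming

    if (isset($_SESSION[$cacheKey])) {
        return $_SESSION[$cacheKey];   // fast path: no DB query at all
    }

    $stmt = $pdo->prepare('SELECT 1 FROM votes WHERE user_id = ? AND article_id = ? LIMIT 1');
    $stmt->execute([$userId, $articleId]);
    $voted = (bool) $stmt->fetchColumn();

    $_SESSION[$cacheKey] = $voted;   // remember to set this to true when a vote is recorded
    return $voted;
}
```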

Answered 2009-12-22T17:01:07.953
0

Memcached would be a very good way to do this. You will need to sync from memcached back to MySQL from time to time (I would do that with a pull model, using a cron script on the MySQL server).
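One way that pull model could look, as a hedged sketch: the web tier buffers votes in memcached as per-article deltas (e.g. `$mc->increment("vote_delta_$articleId")` on each vote), and a cron script on the MySQL server periodically pulls each delta and folds it into the database. The key names, the `articles` table, and its `score` column are assumptions for illustration, and real code would also need to handle the race between reading and deleting a delta.

```php
<?php
// Cron sketch: flush buffered vote deltas from memcached into MySQL.
// Assumes an `articles` table with `id` and `score` columns (illustrative names).
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);
$pdo = new PDO('mysql:host=localhost;dbname=news', 'dbuser', 'dbpass');

$ids = $pdo->query('SELECT id FROM articles')->fetchAll(PDO::FETCH_COLUMN);

foreach ($ids as $articleId) {
    $key   = "vote_delta_$articleId";
    $delta = $mc->get($key);

    if ($delta !== false && (int) $delta !== 0) {
        $stmt = $pdo->prepare('UPDATE articles SET score = score + ? WHERE id = ?');
        $stmt->execute([(int) $delta, $articleId]);
        $mc->delete($key);   // simplification: a vote arriving right here could be lost
    }
}
```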

Answered 2009-12-22T16:58:54.423
0

Since you can't use memcached, I would say this: a decent database server (decent hardware plus a decent DB implementation) should be able to handle this quite well. A single table with a physical index on article-id and a second column representing the vote will easily handle a few googillion (yes, I made up that word) articles :P
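To make that concrete, a quick sketch of the tally query such a table supports (the `votes` table and column names here are assumed); because of the index on article_id, the query touches only that article's rows regardless of how large the table grows:

```php
<?php
// Sketch: per-article tally over an article_id-indexed votes table.
$pdo = new PDO('mysql:host=localhost;dbname=news', 'dbuser', 'dbpass');
$articleId = 7;   // example id

$stmt = $pdo->prepare('
    SELECT vote, COUNT(*) AS n
    FROM votes
    WHERE article_id = ?
    GROUP BY vote
');
$stmt->execute([$articleId]);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo "vote {$row['vote']}: {$row['n']}\n";   // e.g. "vote 1: 120", "vote -1: 8"
}
```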

Rationale:

Database servers maintain statistics (read: they are self-tuning), and only hot items (index and row entries) remain in memory.

Moral:

Don't worry about such things unless they actually become a problem; if your company is the size of Facebook, then I would worry.

Answered 2009-12-22T17:05:36.173
0

Have you seen this?

http://destiney.com/php#Destiney_rated_images

Demo here: http://ratedsite.com/

Answered 2009-12-22T17:08:58.520