
I have a Job model that can have many attachments. The Attachment model has a CarrierWave uploader mounted on it.

class Job < ActiveRecord::Base
  has_many :attachments
end

class Attachment < ActiveRecord::Base
  mount_uploader :url, AttachmentUploader

  belongs_to :job
end
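
For context, the AttachmentUploader itself isn't shown; a minimal uploader along these lines is assumed (the store_dir layout here is only illustrative, not the actual one):

class AttachmentUploader < CarrierWave::Uploader::Base
  # Storage is switched between :file and :fog via configuration.

  # Keep each record's files in their own directory so a cloned job's
  # copies can live separately from the original job's files.
  def store_dir
    "uploads/attachments/#{model.id}"
  end
end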

Jobs can be cloned, and cloning a job should create new Job and Attachment records. That part is easy.

The system then needs to copy the physical files over to the upload location associated with the cloned job.
Is there an easy way to do this with CarrierWave? The solution should support both the local filesystem and AWS S3.

class ClonedJob
  def self.create_from(orig_job)
    @job_clone = orig_job.dup

    if orig_job.attachments.any?
      orig_job.attachments.each do |attach|
        cloned_attachment = attach.dup
        # Need to physically copy files at this point. Otherwise
        # this cloned_attachment will still point to the same file 
        # as the original attachment.
        @job_clone.attachments << cloned_attachment
      end
    end
  end
end

2 Answers


Below is the module I pieced together to accomplish this. It works, but there are a few things I would still improve if it mattered enough. I've just left my thoughts as comments in the code.

require "fileutils"

# IDEA: I think it would make more sense to create another module
# which I could mix into Job for copying attachments. Really, the
# logic for iterating over attachments should be in Job. That way,
# this class could become a more generalized class for copying
# files whether we are on local or remote storage.
#
# The only problem with that is that I would like to not create
# a new connection to AWS every time I copy a file. If I do then
# I could be opening loads of connections if I iterate over an
# array and copy each item. Once I get that part fixed, this
# refactoring should definitely happen.

module UploadCopier
  # Take a job which is a reprint (ie. it's original_id
  # is set to the id of another job) and copy all of 
  # the original jobs remote files over for the reprint
  # to use.
  #
  # Otherwise, if a user edits the reprints attachment
  # files, the files of the original job would also be
  # changed in the process.
  def self.copy_attachments_for(reprint)
    case storage
    when :file
      UploadCopier::LocalUploadCopier.copy_attachments_for(reprint)
    when :fog 
      UploadCopier::S3UploadCopier.copy_attachments_for(reprint)
    end
  end

  # IDEA: Create another method which takes a block. This method
  # can check which storage system we're using and then call
  # the block and pass in the reprint. Would DRY this up a bit more.
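  #
  # Sketch of that idea (untested):
  #
  #   def self.with_copier
  #     yield(storage == :fog ? S3UploadCopier : LocalUploadCopier)
  #   end
  #
  #   def self.copy_attachments_for(reprint)
  #     with_copier { |copier| copier.copy_attachments_for(reprint) }
  #   end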

  def self.copy(old_path, new_path)
    case storage
    when :file
      UploadCopier::LocalUploadCopier.copy(old_path, new_path)
    when :fog 
      UploadCopier::S3UploadCopier.copy(old_path, new_path)
    end
  end

  def self.storage
    # HACK: I should ask CarrierWave what method to use
    # rather than relying on the config variable.
    APP_CONFIG[:carrierwave][:storage].to_sym 
  end
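
  # One way to drop the config lookup (untested sketch): ask the uploader
  # class which storage engine it is configured with, instead of reading
  # APP_CONFIG.
  #
  #   def self.storage
  #     AttachmentUploader.storage == CarrierWave::Storage::Fog ? :fog : :file
  #   end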

  class S3UploadCopier
    # Copy the originals of a certain job's attachments over
    # to a location associated with the reprint.
    def self.copy_attachments_for(reprint)
      reprint.attachments.each do |attachment|
        orig_path = attachment.original_full_storage_path
        # No need to pass :fog in here; it's the default, and we know
        # we're on :fog anyway since we're in the S3UploadCopier.
        new_path = attachment.full_storage_path
        copy(orig_path, new_path)
      end
    end

    # Copy a file from one place to another within a bucket.
    def self.copy(old_path, new_path)
      # INFO: http://goo.gl/lmgya
      object_at(old_path).copy_to(new_path)
    end

  private

    def self.object_at(path)
      bucket.objects[path]
    end

    # IDEA: This will be more flexible if I go through
    # Fog when I open the connection to the remote storage.
    # My credentials are already configured there anyway.
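    #
    # A small step toward that (untested sketch): reuse the credentials
    # CarrierWave's fog config already holds instead of a second set of
    # AWS keys, while keeping the aws-sdk calls below unchanged.
    #
    #   creds = CarrierWave::Uploader::Base.fog_credentials
    #   s3 = AWS::S3.new(access_key_id: creds[:aws_access_key_id],
    #     secret_access_key: creds[:aws_secret_access_key])
    #   s3.buckets[CarrierWave::Uploader::Base.fog_directory]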

    # Get the S3 bucket currently in use.
    def self.bucket
      s3 = AWS::S3.new(access_key_id: APP_CONFIG[:aws][:access_key_id],
        secret_access_key: APP_CONFIG[:aws][:secret_access_key])
      s3.buckets[APP_CONFIG[:fog_directory]]
    end
  end

  # This will only be used in development when uploads are
  # stored on the local file system.
  class LocalUploadCopier
    # Copy the originals of a certain job's attachments over
    # to a location associated with the reprint.
    def self.copy_attachments_for(reprint)
      reprint.attachments.each do |attachment|
        # We have to pass :file in here since the default is :fog.
        orig_path = attachment.original_full_storage_path
        new_path = attachment.full_storage_path(:file)
        copy(orig_path, new_path)
      end
    end

    # Copy a file from one place to another within the
    # local filesystem.
    def self.copy(old_path, new_path)
      FileUtils.mkdir_p(File.dirname(new_path))
      FileUtils.cp(old_path, new_path)
    end
  end
end

I use it like this:

# Have to save the record first because it needs to have a DB ID.
if @cloned_job.save
  UploadCopier.copy_attachments_for(@cloned_job)
end
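
The copier relies on two helpers on Attachment, original_full_storage_path and full_storage_path, which aren't shown above. Roughly, the idea behind them is something like this (a hypothetical sketch; how a cloned attachment finds its original depends on your data model):

class Attachment < ActiveRecord::Base
  mount_uploader :url, AttachmentUploader
  belongs_to :job

  # Storage key/path of this attachment's file. CarrierWave's uploader
  # exposes #store_path ("store_dir/filename"), which doubles as the S3
  # key; for :file storage it is resolved under the local upload root.
  def full_storage_path(storage = :fog)
    if storage == :file
      File.join(CarrierWave::Uploader::Base.root, url.store_path)
    else
      url.store_path
    end
  end

  # Same thing for the matching attachment on the original job. Only the
  # shape is shown here.
  def original_full_storage_path
    # e.g. look up the corresponding attachment on the original job and
    # return its full_storage_path(UploadCopier.storage)
  end
end
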
Answered 2012-09-07T09:49:02.683
class Job < ActiveRecord::Base
  has_many :attachments
end

class Attachment < ActiveRecord::Base
  mount_uploader :attachment, AttachmentUploader
  belongs_to :job
end

class ClonedJob
  def self.create_from(orig_job)
    @job_clone = orig_job.dup

    if orig_job.attachments.any?
      orig_job.attachments.each do |attach|
        cloned_attachment = attach.dup
        @job_clone.attachments << cloned_attachment
        # !!! Here is the trick
        cloned_attachment.remote_attachment_url = attach.attachment_url
      end
    end
  end
end
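
The trick works because CarrierWave defines a remote_<column>_url= setter for every mounted uploader: assigning a URL makes CarrierWave download that file and, when the record is saved, store its own copy, so the cloned attachment no longer points at the original job's file. A minimal usage sketch (assuming create_from is adjusted to return @job_clone, and that attachment_url resolves to a URL CarrierWave can actually fetch, e.g. a public S3 URL):

cloned_job = ClonedJob.create_from(original_job)
# Saving stores the downloaded copies under the cloned attachments'
# own upload locations.
cloned_job.save
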
Answered 2013-04-10T15:46:20.047