I'm building a crawler. It crawls, writes a file locally as a stream, re-reads the file, and uploads it to S3. I'm using socket.io for communication with the UI. I want to send a data
object from the server to the client once the upload to S3 finishes, but I can't work out how to wire this up, so I'm asking here.
The code below is split across three files.
// models/**.js
(...)
await uploadFile(filename)
This is the part that crawls, saves the file as a stream, and then calls s3.js.
// s3.js
(...)
const uploadFile = async filename => {
  await fs.readFile(join(__dirname, output, filename), (err, data) => {
    s3Upload(filename, data)
  })
}

const s3Upload = async (filename, data) => {
  await s3.upload({
    Bucket: 'work543-express-crawler',
    Key: filename,
    Body: data,
    ContentEncoding: 'utf-8',
    ACL: 'public-read-write'
  }, (err, data) => data) // <-- this line
}
I need to pass the data from the callback of s3.upload() to socket.emit(), but I'm not sure how to do it.
// socketIo.js
(...)
const socketServer = app => {
  const socketio = require('socket.io')(app)
  socketio.on('connection', socket => {
    socket.on('append', async data => {
      queue(data)
    })
    socket.emit(someEventName, theData) // <-- this line is pseudo code
  })
}
Here is how I'm thinking about getting data into socket.emit():

- It seems that data, the callback argument of s3.upload(), needs to be exported somehow.
- Or some event should fire that connects socket.emit() and s3.upload().
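Both ideas reduce to the same pattern: have the upload chain return a promise that resolves with the S3 response, and emit inside the async 'append' handler once that promise resolves. Below is a minimal sketch of that handler in isolation; `makeAppendHandler`, `processAndUpload`, and the `'uploaded'` event name are all hypothetical, and `processAndUpload` stands in for the crawl → save → s3.upload chain, assumed to resolve with the S3 response.

```javascript
// Await the crawl → upload chain, then emit the result to this client.
// 'uploaded' is a hypothetical event name; processAndUpload stands in for
// the chain that ends in s3.upload and resolves with { Location, Key, ... }.
const makeAppendHandler = (socket, processAndUpload) => async data => {
  const result = await processAndUpload(data) // resolves after the upload finishes
  socket.emit('uploaded', result)             // now the client receives the S3 data
}

// Wiring it into the existing socketServer:
// socketio.on('connection', socket => {
//   socket.on('append', makeAppendHandler(socket, processAndUpload))
// })
```

The key point is that the emit happens inside the same async handler that awaits the upload, so nothing needs to be exported out of the s3.upload callback; the result flows back up the promise chain instead.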
Am I thinking in the right direction? If so, please give me some guidance on how to fix this.
Thank you.
p.s. If the code above isn't enough to figure it out, please leave a comment.