kcks66

Forum Replies Created

Viewing 15 posts - 46 through 60 (of 181 total)
    in reply to: RapidGate #14449
    kcks66
    Participant

    Hi Mikhail,

    I checked, and none of the Calculated Channels data was uploaded to the Target PC.

    Below is data from DAT files before upload:

    https://ibb.co/8PdLRyF

    Below is data from PostgreSql after upload:

    https://ibb.co/jMzv84S

    Is this normal?
    Thank you.

    in reply to: RapidGate #14445
    kcks66
    Participant

    Hi Mikhail,

    I found many entries similar to this:

    2024-04-18 16:28:18 Error transferring historical data:
    System.IO.IOException: Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond..
    ---> System.Net.Sockets.SocketException (10060): A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
    at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 count)
    --- End of inner exception stack trace ---
    at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 count)
    at Scada.Client.ClientBase.ReceiveResponse(DataPacket request)
    at Scada.Client.ScadaClient.WriteChannelData(Int32 archiveMask, ICollection`1 slices, WriteDataFlags flags)
    at e.j()
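
    For what it's worth, transient 10060 socket timeouts like this are usually handled by retrying the write. RapidGate presumably has its own retry logic, but as a generic illustration (all names below are hypothetical, not RapidGate's API), a retry-with-backoff loop might look like:

```python
import random
import time

def write_with_retry(write_fn, slices, max_attempts=5, base_delay=1.0):
    """Call write_fn(slices), retrying on transient network errors
    with exponential backoff plus a little random jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return write_fn(slices)
        except (ConnectionError, TimeoutError, OSError) as exc:
            if attempt == max_attempts:
                raise  # give up after the last attempt
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.5)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.1f}s")
            time.sleep(delay)
```

    The jitter keeps many clients from retrying in lockstep after a shared outage.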

    in reply to: RapidGate #14441
    kcks66
    Participant

    Hi Mikhail,

    However, I found that some data is lost after it is uploaded and converted into the PostgreSQL database.

    Below is data from DAT files before upload:

    https://ibb.co/6b9DWpD

    Below is data from PostgreSql after upload:

    https://ibb.co/gtBcMD8

    Please advise: did I miss any steps or use a wrong setting?
    Thank you.

    in reply to: RapidGate #14440
    kcks66
    Participant

    Hi Mikhail,

    I manually created the partition in pgAdmin as you advised. It worked.

    https://ibb.co/XzTPsHG

    It looks like my data set is huge: transferring a few days of data takes many hours. I have data from 2020 to 2024 to transfer, and I wonder how long it will take to finish.
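
    As a rough back-of-the-envelope check (the sample numbers below are hypothetical, not measured), the total transfer time can be extrapolated linearly from a measured sample:

```python
from datetime import date

def estimate_transfer_hours(sample_days, sample_hours, start, end):
    """Linearly extrapolate total transfer time from a measured sample."""
    total_days = (end - start).days
    return sample_hours / sample_days * total_days

# Example: if 3 days of data took 4 hours, the 2020-01-01..2024-04-18
# range (1,569 days) would take about 2,092 hours, i.e. roughly 87 days.
hours = estimate_transfer_hours(3, 4, date(2020, 1, 1), date(2024, 4, 18))
```

    Real throughput is rarely constant, so this only gives an order-of-magnitude figure.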

    in reply to: RapidGate #14428
    kcks66
    Participant

    Hi Mikhail,

    It's great that I can now see the historical data upload status in the RapidGate log files. However, a problem occurred on the Target PC side:

    https://ibb.co/6HBfX7G

    Error message:

    2024-04-17 17:48:23 Error writing to database:
    Npgsql.PostgresException (0x80004005): 23514: no partition of relation "min_historical" found for row

    DETAIL: Detail redacted as it may contain sensitive data. Specify 'Include Error Detail' in the connection string to include this information.
    at Npgsql.Internal.NpgsqlConnector.<ReadMessage>g__ReadMessageLong|223_0(NpgsqlConnector connector, Boolean async, DataRowLoadingMode dataRowLoadingMode, Boolean readingNotifications, Boolean isReadingPrependedMessage)
    at Npgsql.NpgsqlDataReader.NextResult(Boolean async, Boolean isConsuming, CancellationToken cancellationToken)
    at Npgsql.NpgsqlCommand.ExecuteReader(CommandBehavior behavior, Boolean async, CancellationToken cancellationToken)
    at Npgsql.NpgsqlCommand.ExecuteReader(CommandBehavior behavior, Boolean async, CancellationToken cancellationToken)
    at Npgsql.NpgsqlCommand.ExecuteNonQuery(Boolean async, CancellationToken cancellationToken)
    at Scada.Server.Modules.ModArcPostgreSql.Logic.PointQueue.ProcessItems()
    Exception data:
    Severity: ERROR
    SqlState: 23514
    MessageText: no partition of relation "min_historical" found for row
    Detail: Detail redacted as it may contain sensitive data. Specify 'Include Error Detail' in the connection string to include this information.
    SchemaName: mod_arc_postgre_sql
    TableName: min_historical
    File: execPartition.c
    Line: 335
    Routine: ExecFindPartition

    Please advise. Thank you.
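
    For reference, the missing partition can be created by hand with a CREATE TABLE ... PARTITION OF statement. The sketch below only builds the DDL string for one calendar month; the schema and table names come from the error above, but the monthly range-partitioning scheme and timestamp key are assumptions, since ModArcPostgreSql normally manages its partitions itself:

```python
from datetime import date

def month_partition_ddl(schema, table, year, month):
    """Build CREATE TABLE ... PARTITION OF DDL for one calendar month,
    assuming the parent table is range-partitioned on its timestamp column."""
    start = date(year, month, 1)
    end = date(year + (month == 12), month % 12 + 1, 1)  # first day of next month
    name = f"{table}_{year}{month:02d}"
    return (
        f"CREATE TABLE IF NOT EXISTS {schema}.{name} "
        f"PARTITION OF {schema}.{table} "
        f"FOR VALUES FROM ('{start}') TO ('{end}');"
    )
```

    Looping this over the whole 2020-2024 range would pre-create every partition the upload could need.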

    in reply to: RapidGate #14422
    kcks66
    Participant

    Hi Mikhail,

    I set up another PC with the RapidGate module. The Upload options appear on this PC.

    https://ibb.co/Zc8yNgD

    Please advise how I should configure it to upload all historical data in DAT format from this PC to the PostgreSQL database on the target PC.

    Thank you.

    in reply to: RapidGate #14418
    kcks66
    Participant

    Hi Mikhail,

    In my case, where I want to convert historical data from DAT format into a PostgreSQL database, which option should I use?

    Also, what should I do to make the Upload options, which are not displayed in my RapidGate, reappear?

    in reply to: Convert historical data from V5 to V6 #14397
    kcks66
    Participant

    Hi Mikhail,

    I finally managed to set up another PC with a Rapid SCADA configuration exactly the same as the target PC's. I managed to log in to the target PC via RapidGate. For the next step, please advise which part I should configure:

    https://ibb.co/L00KSv0
    The Historical Data Transfer option?

    https://ibb.co/GtXdzWb
    The Archive Replication option?

    FYI, all channels on the Target PC and the RapidGate PC are configured with exactly the same channel IDs. The Target PC's archives are configured with ModArcPostgreSql.

    Thank you.

    in reply to: Convert historical data from V5 to V6 #14331
    kcks66
    Participant

    Hi Mikhail,

    > However, why don't you use *.dat format as a secondary to make the system more reliable?

    Yes, I will consider this. Thanks for your suggestion. Since my project has no commercial value, I would like to explore this option:

    > Deploy an additional instance of Rapid SCADA and use the Rapid Gate module to transfer data from the additional instance to the main instance of Rapid SCADA.

    I will get another PC, set up another Rapid SCADA instance, and load it with all the *.dat files. I will try it out and raise questions when the setup is ready.

    Thank you.

    in reply to: Convert historical data from V5 to V6 #14314
    kcks66
    Participant

    Hi Mikhail,

    Thanks. I tried all of them and they worked accordingly.

    After I discussed this with the system users and showed it to them, we finally decided to use PostgreSQL as the single main archive.

    My next step is to migrate all the *.dat files (already in v6 format) into the PostgreSQL database. Please advise if there is a faster or easier way to do such a migration.

    Thank you.

    in reply to: Convert historical data from V5 to V6 #14306
    kcks66
    Participant

    Hi Mikhail,

    I would like to explore all of this advice.

    1) How should I configure the chart profile to open data from the *.dat archive?

    https://ibb.co/Gpz254z

    2) Do I need to set the option below to true?

    <!-- User can select archive -->
    <SelectArchive>false</SelectArchive>

    3) Do I need to activate both sets of archives like below?

    https://ibb.co/RPkhFbf

    Thank you.

    in reply to: Convert historical data from V5 to V6 #14302
    kcks66
    Participant

    Hi Mikhail,

    In my previous Rapid SCADA v5 setup, I used .dat as my main archive. Now, after migrating to v6, I want to use ModArcPostgreSql as my main archive.

    The .dat files from v5 that I converted to the v6 .dat format can only be trended by ModArcBasic, right?

    Please advise what I should do so that, after I set ModArcPostgreSql as my main archive, I can still trend the historical data previously converted from v5.

    Thank you.

    in reply to: Convert historical data from V5 to V6 #14301
    kcks66
    Participant

    Hi Mikhail,

    I can see that the new converter continues running after encountering data with more than 1440 records per day.

    https://ibb.co/34tkBJV

    in reply to: Current Data sudden become Not Ready #14293
    kcks66
    Participant

    Hi Mikhail,

    Today the PC experienced an actual power-supply trip. The PC automatically resumed about 30 minutes later. I checked Rapid SCADA and everything is working well.

    I think the Automatic (Delayed Start) setting worked.

    in reply to: Convert historical data from V5 to V6 #14292
    kcks66
    Participant

    Hi Mikhail,

    I can see that the failing .dat file contains two records per minute, whereas the other successful .dat files contain one record per minute:

    https://ibb.co/10rWGDy

    Any .dat file with more than 1440 records per day will be treated as an error.

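    The duplicate-minute condition described above can be checked programmatically before conversion. A minimal sketch, assuming the timestamps have already been parsed out of the .dat file (the file-reading step is omitted):

```python
from collections import Counter

def find_overfull_days(timestamps, limit=1440):
    """Group minute-resolution timestamps by calendar day and report days
    whose record count exceeds the one-record-per-minute limit."""
    per_day = Counter(ts.date() for ts in timestamps)
    return {day: n for day, n in per_day.items() if n > limit}
```

    Any day returned by this function holds at least one duplicated minute and would trip the converter's 1440-records-per-day check.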