showmustgoon

Forum Replies Created

Viewing 15 posts - 1 through 15 (of 18 total)
    in reply to: Precision Loss Due 64-bit data #15410
    showmustgoon
    Participant

    My point is that when writing data to a channel via a command, the channel type still needs to be known so that the value entered by the user can be handled appropriately, rather than simply converting it directly to a double. Of course, this step can be handled automatically by the driver.
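
    To make the idea concrete, here is a rough sketch of what I have in mind. The names ChannelType and ParseCommandValue are made up for illustration and are not part of the RapidSCADA API, and storing the raw 64-bit pattern through BitConverter is only one possible approach.

    using System;
    using System.Globalization;

    // Hypothetical helper: parse the command value according to the channel's
    // data type instead of always parsing it as a double.
    enum ChannelType { Double, Int64 }

    static class CommandParsing
    {
        public static double ParseCommandValue(string userInput, ChannelType type)
        {
            if (type == ChannelType.Int64 && long.TryParse(userInput, out long intVal))
            {
                // Keep the exact 64-bit pattern inside the double-sized slot
                // (an assumption about how rounding could be avoided, not documented behavior).
                return BitConverter.Int64BitsToDouble(intVal);
            }

            return double.Parse(userInput, CultureInfo.InvariantCulture);
        }
    }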

    in reply to: Extension DLL Constraints #15408
    showmustgoon
    Participant

    Yes, I had previously overlooked this correspondence. When modules/drivers are enabled, the corresponding module/driver code is recorded in the configuration file of either Comm or Server. At runtime, the service then loads {0}.Logic.dll.
    The full name of the Logic class it implements also needs to be Scada.Server.Modules.{0}.Logic.{0}Logic
    or
    Scada.Comm.Drivers.{0}.Logic.{0}Logic,

    where {0} represents the code returned by the View for the module/driver.
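
    To make the convention concrete, here is a minimal sketch for a hypothetical server module whose View returns the code ModMyModule. The ModuleLogic base class and constructor shown here follow the usual module template, so treat the details as an approximation rather than a copy of real code.

    // Compiled as ModMyModule.Logic.dll.
    // Namespace and class name follow the Scada.Server.Modules.{0}.Logic.{0}Logic pattern.
    namespace Scada.Server.Modules.ModMyModule.Logic
    {
        public class ModMyModuleLogic : ModuleLogic
        {
            public ModMyModuleLogic(IServerContext serverContext)
                : base(serverContext)
            {
            }

            // Must match the code returned by the module's View assembly.
            public override string Code => "ModMyModule";
        }
    }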

    in reply to: Precision Loss Due 64-bit data #15399
    showmustgoon
    Participant

    You can check the data type of the channel – integer or double (or null by default) and perform different transformations depending on this.

    Your solution is great. I am also planning to extend the channel table with additional INT64 and UINT64 data types.

    in reply to: Precision Loss Due 64-bit data #15394
    showmustgoon
    Participant

    Some other SCADA products I know of require configuring both the original data type and the converted data type to guarantee the expected conversion. As a result, the list of selectable original types can be very long, covering thousands of types across dozens of protocols.

    In RapidSCADA, the type conversion is done automatically and internally: values are converted to double for subsequent calculations, statistics, and other logic.

    I believe data of this length is not a common requirement. If such a situation does arise, I might consider adding an additional type attribute when configuring the data point, performing extra processing during conversion, and potentially abandoning the calculation and statistical functionalities.

    Additionally, I’m not sure whether replacing double with decimal is a good idea. I only know that decimal offers more precision, but I’m not familiar with the underlying details.
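
    For reference, here is a small standalone C# example of the underlying problem, independent of RapidSCADA: double has a 53-bit mantissa, so 64-bit integers above 2^53 cannot be stored exactly, while decimal (a 128-bit base-10 type with a 96-bit mantissa) holds any 64-bit integer exactly, at the cost of speed and a smaller exponent range.

    using System;

    class PrecisionDemo
    {
        static void Main()
        {
            long original = 9_007_199_254_740_993;   // 2^53 + 1, not representable as a double
            double asDouble = original;              // silently rounds to the nearest double
            decimal asDecimal = original;            // stored exactly

            Console.WriteLine(original);             // 9007199254740993
            Console.WriteLine((long)asDouble);       // 9007199254740992 -> precision lost
            Console.WriteLine(asDecimal);            // 9007199254740993
        }
    }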

    in reply to: Logic of archiveMask #15383
    showmustgoon
    Participant

    I think I finally figured it out.
    When writing, it’s possible to write to multiple archives simultaneously, so using a mask is appropriate.
    However, when reading, you can only read from a single archive, so a bit is used.
    Additionally, I made another mistake: I used the maximum and minimum values of DateTime as parameters for TimeRange, which caused an overflow.

    My test code is as follows; the archiveMask used for writing and the archiveBit used for reading correspond to each other (mask 16 = 1 << bit 4).

    
    if (DateTime.Now - LastTime >= TimeSpan.FromSeconds(10))
    {
        LastTime = DateTime.Now;

        // Write the event; archiveMask 16 = 1 << 4, i.e. the archive whose bit number is 4.
        ServerContext.WriteEvent(16, new Event
        {
            Timestamp = DateTime.UtcNow,
            CnlNum = 1,
            DeviceNum = 1,
            CnlVal = 11,
            CnlStat = CnlStatusID.Defined,
            Text = "123321",
            Data = new byte[] { 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09 },
            // Position = 0
        });

        // Read back from the single archive identified by archiveBit 4,
        // using a bounded TimeRange (DateTime.MinValue/MaxValue caused an overflow here).
        var r = ServerContext.GetEvents(4, new TimeRange(DateTime.Today.AddDays(-2), DateTime.Today.AddDays(2), true), null);
        Log.WriteMessage($"Custom Events{r.Count},{string.Join(",", r.Select(x => x.Timestamp))}", LogMessageType.Info);
    }
    
    in reply to: Logic of archiveMask #15382
    showmustgoon
    Participant

    And… archiveBit?
    I just noticed that when writing an Event an archiveMask is used, but when getting Events an archiveBit is used. This should be the reason why I can’t read the events I wrote. I am now tracking the logic of archiveBit.
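
    If my understanding is right (this is my own assumption, not something I found in the documentation), the two values are related simply by bit position, e.g. as a small C# top-level snippet:

    int archiveBit = 4;                  // identifies a single archive when reading
    int archiveMask = 1 << archiveBit;   // = 16, selects that archive (and possibly others) when writing
    Console.WriteLine(archiveMask);      // 16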

    in reply to: Precision Loss Due 64-bit data #15369
    showmustgoon
    Participant

    I’m using OPC UA right now to collect data and send commands between SCADA and CodeSys.
    The main issue we’ve noticed is that large 64-bit integers lose precision when being collected and uploaded.
    I’m not sure yet whether this is a big enough concern to address, and I’m still deciding whether to make adjustments or just limit operations to avoid the issue.

    in reply to: Precision Loss Due 64-bit data #15361
    showmustgoon
    Participant

    That’s a good idea. I believe the parsing logic needs to be modified accordingly as well. This might affect some built-in functionality of the system (such as statistics). I’ll try to resolve the issue in this direction.

    showmustgoon
    Participant

    I want to implement a page in WebStation to display some custom computed data, but I think these calculations should be performed, cached, and persisted in a custom module on the Server.
    I’m currently unsure how to exchange data between the plugin (Plg) in WebStation and the module in the Server.

    Or should I perhaps move all the computation logic to WebStation instead?

    in reply to: Webstation issues on Win11 #15273
    showmustgoon
    Participant

    Make sure that AspNetCoreModuleV2 is present in your IIS modules list. It would be best if you could provide a screenshot of the error shown when accessing the website.

    in reply to: Webstation issues on Win11 #15270
    showmustgoon
    Participant

    It seems like your IIS installation is missing the AspNetCoreModuleV2.
    I recommend trying to reinstall or repair the ASP.NET Core Runtime.
    The issue might be related to the installation order: sometimes the ASP.NET Core Runtime doesn’t correctly detect the IIS installation, which leads to the absence of this module.
    I’ve encountered this issue frequently in my own deployment scripts, so I usually force a repair of the ASP.NET Core Runtime after the initial installation to resolve it.

    in reply to: Request to Upload RapidSCADA 6.0 Videos to Bilibili #15235
    showmustgoon
    Participant

    I have successfully uploaded the RapidSCADA 6.0 video tutorials to Bilibili. You can find the videos at the following link: https://www.bilibili.com/video/BV1TssJeREvJ/
    I have ensured that all original content remains unaltered and have included full credit to RapidSCADA, along with the original links to the YouTube videos.

    Thank you once again for granting me permission to share these tutorials with the developer community in China. I hope this will be helpful to many who were unable to access the content before.

    in reply to: Seeking Ways to Give Back for RapidSCADA #15213
    showmustgoon
    Participant

    Thank you all for the suggestions!
    My company hired just me, a newcomer, to take on this project (based on RapidSCADA), so I’m not sure if I can manage the technical challenge. But I’ll start by giving the Chinese translation a shot and see what I can improve.

    showmustgoon
    Participant

    Thanks for getting back to me so quickly. Got the link—really appreciate it!

    in reply to: Request to Upload RapidSCADA 6.0 Videos to Bilibili #15211
    showmustgoon
    Participant

    No problem! I’ll work on the uploads over the weekend and will share the links here once they’re approved and published.
